Sep 29 10:44:17 crc systemd[1]: Starting Kubernetes Kubelet...
Sep 29 10:44:18 crc restorecon[4678]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 29 10:44:18 crc restorecon[4678]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc 
restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc 
restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 
10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 10:44:18 crc 
restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 29 10:44:18 crc 
restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18
crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 
10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 29 10:44:18 crc 
restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc 
restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 10:44:18 crc restorecon[4678]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 
crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc 
restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 10:44:18 crc restorecon[4678]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc 
restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc 
restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc 
restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:18 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc 
restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 10:44:19 crc restorecon[4678]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 10:44:19 crc restorecon[4678]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Sep 29 10:44:19 crc kubenswrapper[4752]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 29 10:44:19 crc kubenswrapper[4752]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Sep 29 10:44:19 crc kubenswrapper[4752]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 29 10:44:19 crc kubenswrapper[4752]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 29 10:44:19 crc kubenswrapper[4752]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 29 10:44:19 crc kubenswrapper[4752]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.792695 4752 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796093 4752 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796110 4752 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796115 4752 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796120 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796124 4752 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796129 4752 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796134 4752 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796139 4752 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796144 4752 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 
10:44:19.796149 4752 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796154 4752 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796158 4752 feature_gate.go:330] unrecognized feature gate: Example Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796162 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796166 4752 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796176 4752 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796180 4752 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796184 4752 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796188 4752 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796194 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796197 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796201 4752 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796205 4752 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796208 4752 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796212 4752 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796215 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796219 4752 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796222 4752 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796226 4752 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796230 4752 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796234 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796238 4752 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796241 4752 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 29 10:44:19 crc kubenswrapper[4752]: 
W0929 10:44:19.796245 4752 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796248 4752 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796253 4752 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796257 4752 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796261 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796265 4752 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796269 4752 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796272 4752 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796276 4752 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796279 4752 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796284 4752 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796287 4752 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796291 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796294 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796300 4752 feature_gate.go:353] 
Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796304 4752 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796308 4752 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796312 4752 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796316 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796320 4752 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796323 4752 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796327 4752 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796330 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796334 4752 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796337 4752 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796342 4752 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796346 4752 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796350 4752 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796354 4752 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796358 4752 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796362 4752 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796366 4752 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796371 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796375 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796378 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796382 4752 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796385 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796389 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.796392 4752 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797685 4752 flags.go:64] FLAG: --address="0.0.0.0" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797703 4752 flags.go:64] 
FLAG: --allowed-unsafe-sysctls="[]" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797712 4752 flags.go:64] FLAG: --anonymous-auth="true" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797719 4752 flags.go:64] FLAG: --application-metrics-count-limit="100" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797725 4752 flags.go:64] FLAG: --authentication-token-webhook="false" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797730 4752 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797738 4752 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797745 4752 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797749 4752 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797753 4752 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797758 4752 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797762 4752 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797767 4752 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797771 4752 flags.go:64] FLAG: --cgroup-root="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797775 4752 flags.go:64] FLAG: --cgroups-per-qos="true" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797779 4752 flags.go:64] FLAG: --client-ca-file="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797783 4752 flags.go:64] FLAG: --cloud-config="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797787 4752 flags.go:64] FLAG: --cloud-provider="" Sep 29 10:44:19 
crc kubenswrapper[4752]: I0929 10:44:19.797792 4752 flags.go:64] FLAG: --cluster-dns="[]" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797816 4752 flags.go:64] FLAG: --cluster-domain="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797821 4752 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797826 4752 flags.go:64] FLAG: --config-dir="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797830 4752 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797835 4752 flags.go:64] FLAG: --container-log-max-files="5" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797841 4752 flags.go:64] FLAG: --container-log-max-size="10Mi" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797846 4752 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797850 4752 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797854 4752 flags.go:64] FLAG: --containerd-namespace="k8s.io" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797859 4752 flags.go:64] FLAG: --contention-profiling="false" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797863 4752 flags.go:64] FLAG: --cpu-cfs-quota="true" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797868 4752 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797872 4752 flags.go:64] FLAG: --cpu-manager-policy="none" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797876 4752 flags.go:64] FLAG: --cpu-manager-policy-options="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797881 4752 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797885 4752 flags.go:64] FLAG: 
--enable-controller-attach-detach="true" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797890 4752 flags.go:64] FLAG: --enable-debugging-handlers="true" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797894 4752 flags.go:64] FLAG: --enable-load-reader="false" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797898 4752 flags.go:64] FLAG: --enable-server="true" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797902 4752 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797912 4752 flags.go:64] FLAG: --event-burst="100" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797917 4752 flags.go:64] FLAG: --event-qps="50" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797922 4752 flags.go:64] FLAG: --event-storage-age-limit="default=0" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797927 4752 flags.go:64] FLAG: --event-storage-event-limit="default=0" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797931 4752 flags.go:64] FLAG: --eviction-hard="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797938 4752 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797942 4752 flags.go:64] FLAG: --eviction-minimum-reclaim="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797947 4752 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797952 4752 flags.go:64] FLAG: --eviction-soft="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797957 4752 flags.go:64] FLAG: --eviction-soft-grace-period="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797961 4752 flags.go:64] FLAG: --exit-on-lock-contention="false" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797966 4752 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797970 
4752 flags.go:64] FLAG: --experimental-mounter-path="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797974 4752 flags.go:64] FLAG: --fail-cgroupv1="false" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797978 4752 flags.go:64] FLAG: --fail-swap-on="true" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797982 4752 flags.go:64] FLAG: --feature-gates="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797987 4752 flags.go:64] FLAG: --file-check-frequency="20s" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797992 4752 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.797996 4752 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798000 4752 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798004 4752 flags.go:64] FLAG: --healthz-port="10248" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798008 4752 flags.go:64] FLAG: --help="false" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798012 4752 flags.go:64] FLAG: --hostname-override="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798017 4752 flags.go:64] FLAG: --housekeeping-interval="10s" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798022 4752 flags.go:64] FLAG: --http-check-frequency="20s" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798026 4752 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798030 4752 flags.go:64] FLAG: --image-credential-provider-config="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798034 4752 flags.go:64] FLAG: --image-gc-high-threshold="85" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798037 4752 flags.go:64] FLAG: --image-gc-low-threshold="80" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798041 4752 flags.go:64] FLAG: 
--image-service-endpoint="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798045 4752 flags.go:64] FLAG: --kernel-memcg-notification="false" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798239 4752 flags.go:64] FLAG: --kube-api-burst="100" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798243 4752 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798247 4752 flags.go:64] FLAG: --kube-api-qps="50" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798252 4752 flags.go:64] FLAG: --kube-reserved="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798256 4752 flags.go:64] FLAG: --kube-reserved-cgroup="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798260 4752 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798264 4752 flags.go:64] FLAG: --kubelet-cgroups="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798268 4752 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798272 4752 flags.go:64] FLAG: --lock-file="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798276 4752 flags.go:64] FLAG: --log-cadvisor-usage="false" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798280 4752 flags.go:64] FLAG: --log-flush-frequency="5s" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798284 4752 flags.go:64] FLAG: --log-json-info-buffer-size="0" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798290 4752 flags.go:64] FLAG: --log-json-split-stream="false" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798294 4752 flags.go:64] FLAG: --log-text-info-buffer-size="0" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798298 4752 flags.go:64] FLAG: --log-text-split-stream="false" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798302 4752 flags.go:64] FLAG: 
--logging-format="text" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798306 4752 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798311 4752 flags.go:64] FLAG: --make-iptables-util-chains="true" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798315 4752 flags.go:64] FLAG: --manifest-url="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798319 4752 flags.go:64] FLAG: --manifest-url-header="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798324 4752 flags.go:64] FLAG: --max-housekeeping-interval="15s" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798328 4752 flags.go:64] FLAG: --max-open-files="1000000" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798334 4752 flags.go:64] FLAG: --max-pods="110" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798338 4752 flags.go:64] FLAG: --maximum-dead-containers="-1" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798344 4752 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798348 4752 flags.go:64] FLAG: --memory-manager-policy="None" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798353 4752 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798357 4752 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798362 4752 flags.go:64] FLAG: --node-ip="192.168.126.11" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798367 4752 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798385 4752 flags.go:64] FLAG: --node-status-max-images="50" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798394 4752 flags.go:64] FLAG: 
--node-status-update-frequency="10s" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798399 4752 flags.go:64] FLAG: --oom-score-adj="-999" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798404 4752 flags.go:64] FLAG: --pod-cidr="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798409 4752 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798420 4752 flags.go:64] FLAG: --pod-manifest-path="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798426 4752 flags.go:64] FLAG: --pod-max-pids="-1" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798431 4752 flags.go:64] FLAG: --pods-per-core="0" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798436 4752 flags.go:64] FLAG: --port="10250" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798441 4752 flags.go:64] FLAG: --protect-kernel-defaults="false" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798446 4752 flags.go:64] FLAG: --provider-id="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798450 4752 flags.go:64] FLAG: --qos-reserved="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798454 4752 flags.go:64] FLAG: --read-only-port="10255" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798459 4752 flags.go:64] FLAG: --register-node="true" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798463 4752 flags.go:64] FLAG: --register-schedulable="true" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798466 4752 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798475 4752 flags.go:64] FLAG: --registry-burst="10" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798479 4752 flags.go:64] FLAG: --registry-qps="5" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 
10:44:19.798483 4752 flags.go:64] FLAG: --reserved-cpus="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798487 4752 flags.go:64] FLAG: --reserved-memory="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798492 4752 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798496 4752 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798501 4752 flags.go:64] FLAG: --rotate-certificates="false" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798505 4752 flags.go:64] FLAG: --rotate-server-certificates="false" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798509 4752 flags.go:64] FLAG: --runonce="false" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798512 4752 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798518 4752 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798523 4752 flags.go:64] FLAG: --seccomp-default="false" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798527 4752 flags.go:64] FLAG: --serialize-image-pulls="true" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798532 4752 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798537 4752 flags.go:64] FLAG: --storage-driver-db="cadvisor" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798541 4752 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798546 4752 flags.go:64] FLAG: --storage-driver-password="root" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798549 4752 flags.go:64] FLAG: --storage-driver-secure="false" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798553 4752 flags.go:64] FLAG: --storage-driver-table="stats" Sep 29 10:44:19 crc 
kubenswrapper[4752]: I0929 10:44:19.798557 4752 flags.go:64] FLAG: --storage-driver-user="root" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798561 4752 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798566 4752 flags.go:64] FLAG: --sync-frequency="1m0s" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798570 4752 flags.go:64] FLAG: --system-cgroups="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798574 4752 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798580 4752 flags.go:64] FLAG: --system-reserved-cgroup="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798584 4752 flags.go:64] FLAG: --tls-cert-file="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798587 4752 flags.go:64] FLAG: --tls-cipher-suites="[]" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798593 4752 flags.go:64] FLAG: --tls-min-version="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798597 4752 flags.go:64] FLAG: --tls-private-key-file="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798602 4752 flags.go:64] FLAG: --topology-manager-policy="none" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798606 4752 flags.go:64] FLAG: --topology-manager-policy-options="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798610 4752 flags.go:64] FLAG: --topology-manager-scope="container" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798614 4752 flags.go:64] FLAG: --v="2" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798620 4752 flags.go:64] FLAG: --version="false" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798625 4752 flags.go:64] FLAG: --vmodule="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.798631 4752 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Sep 29 10:44:19 crc 
kubenswrapper[4752]: I0929 10:44:19.798635 4752 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798743 4752 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798748 4752 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798754 4752 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798760 4752 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798765 4752 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798771 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798775 4752 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798779 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798784 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798789 4752 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798793 4752 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798817 4752 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798821 4752 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 29 
10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798825 4752 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798829 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798832 4752 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798836 4752 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798839 4752 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798843 4752 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798847 4752 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798850 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798855 4752 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798858 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798862 4752 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798865 4752 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798869 4752 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798872 4752 feature_gate.go:330] unrecognized feature gate: Example Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798876 4752 
feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798880 4752 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798884 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798887 4752 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798891 4752 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798894 4752 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798898 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798901 4752 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798904 4752 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798908 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798912 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798916 4752 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798920 4752 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798924 4752 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798929 4752 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798932 4752 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798936 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798939 4752 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798943 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798947 4752 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798950 4752 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798954 4752 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798958 4752 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798961 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798965 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798968 4752 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798972 4752 feature_gate.go:330] unrecognized feature 
gate: MultiArchInstallGCP Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798977 4752 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798981 4752 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798984 4752 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798988 4752 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798992 4752 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798995 4752 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.798999 4752 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.799002 4752 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.799005 4752 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.799010 4752 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.799015 4752 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.799019 4752 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.799022 4752 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.799026 4752 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather 
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.799029 4752 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.799033 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.799037 4752 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.799051 4752 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.809737 4752 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.809786 4752 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809886 4752 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809897 4752 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809903 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809907 4752 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809913 4752 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809920 4752 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809924 4752 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809928 4752 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809932 4752 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809935 4752 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809939 4752 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809958 4752 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809963 4752 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809967 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809972 4752 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809977 4752 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809981 4752 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809984 4752 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809988 4752 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809993 4752 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.809997 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810001 4752 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810005 4752 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810008 4752 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810012 4752 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810202 4752 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810206 4752 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810210 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810214 4752 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810217 4752 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810220 4752 feature_gate.go:330] unrecognized feature gate: Example
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810225 4752 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810231 4752 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810239 4752 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810246 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810251 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810257 4752 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810262 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810266 4752 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810271 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810274 4752 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810278 4752 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810282 4752 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810285 4752 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810289 4752 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810293 4752 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810296 4752 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810299 4752 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810303 4752 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810307 4752 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810311 4752 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810315 4752 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810319 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810323 4752 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810328 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810333 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810338 4752 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810342 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810346 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810350 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810354 4752 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810357 4752 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810361 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810364 4752 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810368 4752 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810374 4752 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810377 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810381 4752 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810384 4752 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810388 4752 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810392 4752 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.810399 4752 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810516 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810522 4752 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810527 4752 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810531 4752 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810535 4752 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810538 4752 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810541 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810545 4752 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810549 4752 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810552 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810556 4752 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810559 4752 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810563 4752 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810568 4752 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810573 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810577 4752 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810581 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810585 4752 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810588 4752 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810593 4752 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810597 4752 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810601 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810605 4752 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810608 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810612 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810616 4752 feature_gate.go:330] unrecognized feature gate: Example
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810622 4752 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810626 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810629 4752 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810633 4752 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810637 4752 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810641 4752 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810646 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810651 4752 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810655 4752 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810659 4752 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810662 4752 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810666 4752 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810670 4752 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810675 4752 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810680 4752 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810684 4752 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810688 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810692 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810697 4752 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810701 4752 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810706 4752 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810710 4752 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810715 4752 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810719 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810722 4752 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810726 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810730 4752 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810734 4752 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810737 4752 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810742 4752 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810746 4752 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810750 4752 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810754 4752 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810758 4752 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810761 4752 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810765 4752 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810768 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810772 4752 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810775 4752 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810779 4752 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810782 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810786 4752 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810789 4752 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810793 4752 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.810796 4752 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.810817 4752 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.811024 4752 server.go:940] "Client rotation is on, will bootstrap in background"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.816223 4752 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.816317 4752 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
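The bulk of the startup log above is repeated `feature_gate.go:330` "unrecognized feature gate" warnings. As an editorial aside (a sketch using standard shell tools, not part of the captured log), such warnings can be deduplicated and counted for triage; the heredoc below stands in for real `journalctl -u kubelet` output:

```shell
# Count how often each unrecognized feature gate is reported.
# Against a live node you would feed the same pipeline from the journal:
#   journalctl -u kubelet | grep 'unrecognized feature gate' | awk -F': ' '{print $NF}' | sort | uniq -c | sort -rn
grep 'unrecognized feature gate' <<'EOF' | awk -F': ' '{print $NF}' | sort | uniq -c | sort -rn
W0929 10:44:19.798920 4752 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
W0929 10:44:19.809907 4752 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
W0929 10:44:19.798929 4752 feature_gate.go:330] unrecognized feature gate: GatewayAPI
EOF
# counts ChunkSizeMiB twice, GatewayAPI once
```

The `-F': '` field separator works here because the only colon-space in each record precedes the gate name; timestamps and `feature_gate.go:330]` use bare colons.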
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.818282 4752 server.go:997] "Starting client certificate rotation"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.818311 4752 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.818564 4752 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-04 05:51:04.021294153 +0000 UTC
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.818702 4752 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1579h6m44.202594517s for next certificate rotation
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.855459 4752 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.857893 4752 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.880407 4752 log.go:25] "Validated CRI v1 runtime API"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.917072 4752 log.go:25] "Validated CRI v1 image API"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.919329 4752 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.927640 4752 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-09-29-10-40-31-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.927686 4752 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}]
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.946513 4752 manager.go:217] Machine: {Timestamp:2025-09-29 10:44:19.942928526 +0000 UTC m=+0.732070213 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:d8106fc8-56a6-4aa2-998a-aa38bb8caa68 BootID:67757396-6dfe-4e60-ba89-bdfd50031eb3 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:0d:fe:83 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:0d:fe:83 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:aa:f0:74 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:90:55:87 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:2d:4b:e7 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:56:4e:70 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:12:2c:90:26:e7:49 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:d2:ce:13:b4:11:58 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.946763 4752 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.946988 4752 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.948153 4752 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.948383 4752 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.948420 4752 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.948631 4752 topology_manager.go:138] "Creating topology manager with none policy"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.948642 4752 container_manager_linux.go:303] "Creating device plugin manager"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.949236 4752 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.949274 4752 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.950372 4752 state_mem.go:36] "Initialized new in-memory state store"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.950481 4752 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.955152 4752 kubelet.go:418] "Attempting to sync node with API server"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.955190 4752 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.955225 4752 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.955245 4752 kubelet.go:324] "Adding apiserver pod source"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.955262 4752 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.961260 4752 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.962438 4752 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.964061 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.21:6443: connect: connection refused
Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.964070 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.21:6443: connect: connection refused
Sep 29 10:44:19 crc kubenswrapper[4752]: E0929 10:44:19.964207 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.21:6443: connect: connection refused" logger="UnhandledError"
Sep 29 10:44:19 crc kubenswrapper[4752]: E0929 10:44:19.964203 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.21:6443: connect: connection refused" logger="UnhandledError"
Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.964762 4752 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 29 10:44:19 crc 
kubenswrapper[4752]: I0929 10:44:19.968410 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.968448 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.968458 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.968466 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.968482 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.968492 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.968502 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.968517 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.968528 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.968538 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.968551 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.968559 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.969683 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.970351 4752 server.go:1280] "Started kubelet" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 
10:44:19.971193 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.21:6443: connect: connection refused Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.971557 4752 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.971674 4752 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 29 10:44:19 crc systemd[1]: Started Kubernetes Kubelet. Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.973336 4752 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.974317 4752 server.go:460] "Adding debug handlers to kubelet server" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.975202 4752 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.975241 4752 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.975368 4752 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 15:45:07.629661456 +0000 UTC Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.975416 4752 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 941h0m47.65424819s for next certificate rotation Sep 29 10:44:19 crc kubenswrapper[4752]: E0929 10:44:19.975723 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.976460 4752 volume_manager.go:287] "The desired_state_of_world populator starts" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 
10:44:19.976509 4752 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.976727 4752 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.976946 4752 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.976970 4752 factory.go:55] Registering systemd factory Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.976982 4752 factory.go:221] Registration of the systemd container factory successfully Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.977347 4752 factory.go:153] Registering CRI-O factory Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.977458 4752 factory.go:221] Registration of the crio container factory successfully Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.977553 4752 factory.go:103] Registering Raw factory Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.977635 4752 manager.go:1196] Started watching for new ooms in manager Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.978821 4752 manager.go:319] Starting recovery of all containers Sep 29 10:44:19 crc kubenswrapper[4752]: W0929 10:44:19.978765 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.21:6443: connect: connection refused Sep 29 10:44:19 crc kubenswrapper[4752]: E0929 10:44:19.979393 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.21:6443: 
connect: connection refused" interval="200ms" Sep 29 10:44:19 crc kubenswrapper[4752]: E0929 10:44:19.979431 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.21:6443: connect: connection refused" logger="UnhandledError" Sep 29 10:44:19 crc kubenswrapper[4752]: E0929 10:44:19.980491 4752 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.21:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1869baedccc42df8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-29 10:44:19.97031372 +0000 UTC m=+0.759455387,LastTimestamp:2025-09-29 10:44:19.97031372 +0000 UTC m=+0.759455387,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.983954 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984018 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 
10:44:19.984029 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984039 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984048 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984058 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984067 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984077 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984090 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984099 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984109 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984118 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984126 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984137 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984146 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984156 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984165 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984174 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984184 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984192 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984205 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984435 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984466 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984476 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984491 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984500 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984555 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984574 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984583 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984594 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984606 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984617 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984684 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984694 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984705 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984719 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984729 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984740 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984821 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984831 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984844 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984854 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.984871 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.986758 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.986868 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.986890 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.986928 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.986949 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.986976 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.986995 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987016 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987040 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987066 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987095 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987124 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987161 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987180 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987199 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987226 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987248 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987280 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987299 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987318 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" 
seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987342 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987363 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987389 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987414 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987437 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987467 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 
10:44:19.987491 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987524 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987550 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987572 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987605 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987634 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987670 4752 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987711 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987740 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987772 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987826 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987864 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987889 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987909 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987937 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987960 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.987986 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988021 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988045 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988077 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988101 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988126 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988159 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988187 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988220 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988247 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988269 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988301 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988325 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988356 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988379 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988402 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988441 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988465 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988500 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988535 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988575 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988613 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988642 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988684 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988721 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988748 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988786 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" 
seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988849 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988904 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988942 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988969 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.988993 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989024 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Sep 29 10:44:19 crc 
kubenswrapper[4752]: I0929 10:44:19.989046 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989079 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989217 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989233 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989255 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989271 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989291 4752 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989334 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989350 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989369 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989385 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989402 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989422 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989438 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989457 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989472 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989489 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989542 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989573 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989596 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989612 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989626 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989647 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989662 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989680 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" 
seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989694 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989710 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989731 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989754 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989775 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989826 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: 
I0929 10:44:19.989848 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989877 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989898 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989918 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989941 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989956 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989975 4752 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.989994 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990007 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990025 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990041 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990077 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990094 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990108 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990128 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990144 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990162 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990178 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990196 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990216 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990234 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990273 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990288 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990306 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990325 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990341 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990355 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990377 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990396 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990422 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990441 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" 
seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990462 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990486 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990506 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990532 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990552 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990573 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990596 4752 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990622 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990646 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990665 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990683 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.990881 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.996980 4752 reconstruct.go:144] "Volume is marked device as uncertain and added 
into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.997050 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.997070 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.997086 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.997103 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.997116 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.997128 4752 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.997142 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.997153 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.997169 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.997181 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.997194 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.997205 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.997217 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.997228 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.997240 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.997251 4752 reconstruct.go:97] "Volume reconstruction finished" Sep 29 10:44:19 crc kubenswrapper[4752]: I0929 10:44:19.997260 4752 reconciler.go:26] "Reconciler: start to sync state" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.007473 4752 manager.go:324] Recovery completed Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.017973 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.019849 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.019890 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.019927 4752 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.020643 4752 cpu_manager.go:225] "Starting CPU manager" policy="none" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.020667 4752 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.020695 4752 state_mem.go:36] "Initialized new in-memory state store" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.027736 4752 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.029429 4752 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.029575 4752 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.029704 4752 kubelet.go:2335] "Starting kubelet main sync loop" Sep 29 10:44:20 crc kubenswrapper[4752]: E0929 10:44:20.029819 4752 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 29 10:44:20 crc kubenswrapper[4752]: W0929 10:44:20.030536 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.21:6443: connect: connection refused Sep 29 10:44:20 crc kubenswrapper[4752]: E0929 10:44:20.030633 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.21:6443: connect: connection refused" logger="UnhandledError" Sep 29 10:44:20 crc 
kubenswrapper[4752]: I0929 10:44:20.037011 4752 policy_none.go:49] "None policy: Start" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.037997 4752 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.038031 4752 state_mem.go:35] "Initializing new in-memory state store" Sep 29 10:44:20 crc kubenswrapper[4752]: E0929 10:44:20.076284 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.114534 4752 manager.go:334] "Starting Device Plugin manager" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.114613 4752 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.114630 4752 server.go:79] "Starting device plugin registration server" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.115222 4752 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.115245 4752 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.115420 4752 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.115651 4752 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.115667 4752 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 29 10:44:20 crc kubenswrapper[4752]: E0929 10:44:20.124067 4752 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.130293 4752 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.130422 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.131759 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.131832 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.131844 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.132061 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.132328 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.132381 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.133222 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.133246 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.133256 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.133403 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.133442 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.133491 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.133508 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.133530 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.133550 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.134211 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.134247 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.134259 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.135017 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.135055 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.135068 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.135228 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.135322 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.135349 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.136383 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.136406 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.136405 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.136451 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.136474 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.136416 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.136824 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.137363 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.137587 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.138121 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.138174 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.138191 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.138518 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.138572 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.138816 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.138845 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.138892 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.140243 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.140272 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 
10:44:20.140282 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:20 crc kubenswrapper[4752]: E0929 10:44:20.181015 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.21:6443: connect: connection refused" interval="400ms" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.199580 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.199629 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.199656 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.199681 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.199698 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.199716 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.199734 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.199755 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.199775 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 10:44:20 crc 
kubenswrapper[4752]: I0929 10:44:20.199893 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.200034 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.200068 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.200096 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.200142 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.200194 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.215538 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.217227 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.217285 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.217296 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.217325 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 10:44:20 crc kubenswrapper[4752]: E0929 10:44:20.217911 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.21:6443: connect: connection refused" node="crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301260 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301313 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301333 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301348 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301389 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301424 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301417 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 
10:44:20.301477 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301493 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301501 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301511 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301527 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301528 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" 
(UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301558 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301563 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301566 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301593 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301595 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301614 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301618 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301489 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301577 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301639 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301657 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301673 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301667 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301690 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301704 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301717 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.301728 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.419042 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.420338 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.420387 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.420397 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.420425 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 10:44:20 crc kubenswrapper[4752]: E0929 10:44:20.421071 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.21:6443: connect: connection refused" node="crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.468935 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.477641 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.500385 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.516350 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.522121 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 10:44:20 crc kubenswrapper[4752]: W0929 10:44:20.527939 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-02bfb49d2e1156cd6daab3d02f7139d61342a3a495821ae220ca6f9cf36a20a0 WatchSource:0}: Error finding container 02bfb49d2e1156cd6daab3d02f7139d61342a3a495821ae220ca6f9cf36a20a0: Status 404 returned error can't find the container with id 02bfb49d2e1156cd6daab3d02f7139d61342a3a495821ae220ca6f9cf36a20a0 Sep 29 10:44:20 crc kubenswrapper[4752]: W0929 10:44:20.528949 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-327b2d598a585cfac8bf6771a3b12692ca0c5591cb5d1aad669a09fef4bc038a WatchSource:0}: Error finding container 327b2d598a585cfac8bf6771a3b12692ca0c5591cb5d1aad669a09fef4bc038a: Status 404 returned error can't find the container with id 327b2d598a585cfac8bf6771a3b12692ca0c5591cb5d1aad669a09fef4bc038a Sep 29 10:44:20 crc kubenswrapper[4752]: W0929 10:44:20.540873 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-9aa18858126777956d651e633882fbb170761ea01757c31b0130ab6d1f10f222 WatchSource:0}: Error finding container 9aa18858126777956d651e633882fbb170761ea01757c31b0130ab6d1f10f222: Status 404 returned error can't find the 
container with id 9aa18858126777956d651e633882fbb170761ea01757c31b0130ab6d1f10f222 Sep 29 10:44:20 crc kubenswrapper[4752]: W0929 10:44:20.545590 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-59e7090cd3a6ce9037d3b6bf42ab1ae7dbb7cd36aa9702662fd226b307aeecc5 WatchSource:0}: Error finding container 59e7090cd3a6ce9037d3b6bf42ab1ae7dbb7cd36aa9702662fd226b307aeecc5: Status 404 returned error can't find the container with id 59e7090cd3a6ce9037d3b6bf42ab1ae7dbb7cd36aa9702662fd226b307aeecc5 Sep 29 10:44:20 crc kubenswrapper[4752]: W0929 10:44:20.546141 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-4824f2d1628c8feec3e11995849ed9f8c7e6b69a2ac6ad1080c59f349a3c9a66 WatchSource:0}: Error finding container 4824f2d1628c8feec3e11995849ed9f8c7e6b69a2ac6ad1080c59f349a3c9a66: Status 404 returned error can't find the container with id 4824f2d1628c8feec3e11995849ed9f8c7e6b69a2ac6ad1080c59f349a3c9a66 Sep 29 10:44:20 crc kubenswrapper[4752]: E0929 10:44:20.582367 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.21:6443: connect: connection refused" interval="800ms" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.821582 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.823332 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.823368 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 
10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.823379 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.823403 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 10:44:20 crc kubenswrapper[4752]: E0929 10:44:20.824143 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.21:6443: connect: connection refused" node="crc" Sep 29 10:44:20 crc kubenswrapper[4752]: W0929 10:44:20.900401 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.21:6443: connect: connection refused Sep 29 10:44:20 crc kubenswrapper[4752]: E0929 10:44:20.900515 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.21:6443: connect: connection refused" logger="UnhandledError" Sep 29 10:44:20 crc kubenswrapper[4752]: W0929 10:44:20.955200 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.21:6443: connect: connection refused Sep 29 10:44:20 crc kubenswrapper[4752]: E0929 10:44:20.955323 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.21:6443: connect: 
connection refused" logger="UnhandledError" Sep 29 10:44:20 crc kubenswrapper[4752]: I0929 10:44:20.972499 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.21:6443: connect: connection refused Sep 29 10:44:21 crc kubenswrapper[4752]: I0929 10:44:21.034892 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"59e7090cd3a6ce9037d3b6bf42ab1ae7dbb7cd36aa9702662fd226b307aeecc5"} Sep 29 10:44:21 crc kubenswrapper[4752]: I0929 10:44:21.036017 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"327b2d598a585cfac8bf6771a3b12692ca0c5591cb5d1aad669a09fef4bc038a"} Sep 29 10:44:21 crc kubenswrapper[4752]: I0929 10:44:21.037061 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"02bfb49d2e1156cd6daab3d02f7139d61342a3a495821ae220ca6f9cf36a20a0"} Sep 29 10:44:21 crc kubenswrapper[4752]: I0929 10:44:21.038244 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9aa18858126777956d651e633882fbb170761ea01757c31b0130ab6d1f10f222"} Sep 29 10:44:21 crc kubenswrapper[4752]: I0929 10:44:21.039354 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4824f2d1628c8feec3e11995849ed9f8c7e6b69a2ac6ad1080c59f349a3c9a66"} Sep 29 10:44:21 crc kubenswrapper[4752]: W0929 
10:44:21.249975 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.21:6443: connect: connection refused Sep 29 10:44:21 crc kubenswrapper[4752]: E0929 10:44:21.250929 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.21:6443: connect: connection refused" logger="UnhandledError" Sep 29 10:44:21 crc kubenswrapper[4752]: E0929 10:44:21.384076 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.21:6443: connect: connection refused" interval="1.6s" Sep 29 10:44:21 crc kubenswrapper[4752]: W0929 10:44:21.485259 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.21:6443: connect: connection refused Sep 29 10:44:21 crc kubenswrapper[4752]: E0929 10:44:21.485425 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.21:6443: connect: connection refused" logger="UnhandledError" Sep 29 10:44:21 crc kubenswrapper[4752]: I0929 10:44:21.625077 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:21 crc kubenswrapper[4752]: I0929 10:44:21.627649 4752 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:21 crc kubenswrapper[4752]: I0929 10:44:21.627704 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:21 crc kubenswrapper[4752]: I0929 10:44:21.627716 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:21 crc kubenswrapper[4752]: I0929 10:44:21.627750 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 10:44:21 crc kubenswrapper[4752]: E0929 10:44:21.628278 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.21:6443: connect: connection refused" node="crc" Sep 29 10:44:21 crc kubenswrapper[4752]: I0929 10:44:21.972556 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.21:6443: connect: connection refused Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.044441 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.044490 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb"} Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.044306 4752 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb" exitCode=0 Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.045695 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.045749 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.045768 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.046932 4752 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951" exitCode=0 Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.047146 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.047246 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951"} Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.048117 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.048174 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.048192 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.051372 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7"} Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.051446 
4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d"} Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.051472 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc"} Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.054041 4752 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="88c78244f091e746e6cad8937b40c33fd6aef6118e696069f48acc0201635f54" exitCode=0 Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.054124 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"88c78244f091e746e6cad8937b40c33fd6aef6118e696069f48acc0201635f54"} Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.054163 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.055010 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.055045 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.055058 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.055963 4752 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c" exitCode=0 Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.056053 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.056065 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c"} Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.057151 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.057180 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.057192 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.060980 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.061916 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.062008 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.062069 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:22 crc kubenswrapper[4752]: W0929 10:44:22.872491 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.21:6443: connect: connection refused Sep 29 10:44:22 crc kubenswrapper[4752]: E0929 10:44:22.872575 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.21:6443: connect: connection refused" logger="UnhandledError" Sep 29 10:44:22 crc kubenswrapper[4752]: I0929 10:44:22.973513 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.21:6443: connect: connection refused Sep 29 10:44:22 crc kubenswrapper[4752]: E0929 10:44:22.985691 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.21:6443: connect: connection refused" interval="3.2s" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.062256 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8b188b508629875e659215e5d09b261c54073368b770d1f876b5b0146b27f1af"} Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.062367 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.063607 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.063658 4752 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.063675 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.065698 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a"} Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.065758 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19"} Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.065769 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6"} Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.065779 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a"} Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.070074 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.070286 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1c6d3ad808fe69e726b66a03be183d33f000a614fadbc7f644015633fbb2b457"} Sep 29 10:44:23 crc 
kubenswrapper[4752]: I0929 10:44:23.070343 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"44a5132d9611cf58eef747d86fd0cef4eb52366b9d1bacc6df0cf5be145d3998"} Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.070658 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9bab580564f9dd31f6b2ea23a31918a9fdd2f247d13a0bd882f38dbaee4bf0b8"} Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.071313 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.071367 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.071379 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.076065 4752 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34" exitCode=0 Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.076138 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34"} Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.076181 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.077048 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.077099 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.077113 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.083677 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7"} Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.083784 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.087932 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.087977 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.087987 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.228524 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.229832 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.229866 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.229875 4752 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:23 crc kubenswrapper[4752]: I0929 10:44:23.229900 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 10:44:23 crc kubenswrapper[4752]: E0929 10:44:23.230387 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.21:6443: connect: connection refused" node="crc" Sep 29 10:44:23 crc kubenswrapper[4752]: W0929 10:44:23.321944 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.21:6443: connect: connection refused Sep 29 10:44:23 crc kubenswrapper[4752]: E0929 10:44:23.322024 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.21:6443: connect: connection refused" logger="UnhandledError" Sep 29 10:44:23 crc kubenswrapper[4752]: W0929 10:44:23.614544 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.21:6443: connect: connection refused Sep 29 10:44:23 crc kubenswrapper[4752]: E0929 10:44:23.614650 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.21:6443: connect: connection refused" logger="UnhandledError" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.089741 4752 
generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780" exitCode=0 Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.089881 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780"} Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.089927 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.090737 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.090769 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.090780 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.092549 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.092586 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.092982 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.093459 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"243027f874c8561246fc694e29dc97e29bdfa821afba81016f1c3dd7433a43d3"} Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 
10:44:24.093556 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.093908 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.093952 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.093972 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.093982 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.093913 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.094243 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.094273 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.094640 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.094671 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.094682 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.094846 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.094887 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:24 crc kubenswrapper[4752]: I0929 10:44:24.094899 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:25 crc kubenswrapper[4752]: I0929 10:44:25.064486 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 10:44:25 crc kubenswrapper[4752]: I0929 10:44:25.101837 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc"} Sep 29 10:44:25 crc kubenswrapper[4752]: I0929 10:44:25.101918 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a"} Sep 29 10:44:25 crc kubenswrapper[4752]: I0929 10:44:25.101937 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808"} Sep 29 10:44:25 crc kubenswrapper[4752]: I0929 10:44:25.101945 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:25 crc kubenswrapper[4752]: I0929 10:44:25.101954 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:25 crc kubenswrapper[4752]: I0929 10:44:25.101950 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2"} Sep 29 10:44:25 crc kubenswrapper[4752]: I0929 10:44:25.102122 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 10:44:25 crc kubenswrapper[4752]: I0929 10:44:25.102140 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd"} Sep 29 10:44:25 crc kubenswrapper[4752]: I0929 10:44:25.103369 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:25 crc kubenswrapper[4752]: I0929 10:44:25.103700 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:25 crc kubenswrapper[4752]: I0929 10:44:25.103716 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:25 crc kubenswrapper[4752]: I0929 10:44:25.103485 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:25 crc kubenswrapper[4752]: I0929 10:44:25.104105 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:25 crc kubenswrapper[4752]: I0929 10:44:25.104119 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:25 crc kubenswrapper[4752]: I0929 10:44:25.425247 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Sep 29 10:44:26 crc kubenswrapper[4752]: I0929 10:44:26.104615 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 
10:44:26 crc kubenswrapper[4752]: I0929 10:44:26.104743 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:26 crc kubenswrapper[4752]: I0929 10:44:26.105913 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:26 crc kubenswrapper[4752]: I0929 10:44:26.106003 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:26 crc kubenswrapper[4752]: I0929 10:44:26.106016 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:26 crc kubenswrapper[4752]: I0929 10:44:26.106904 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:26 crc kubenswrapper[4752]: I0929 10:44:26.106947 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:26 crc kubenswrapper[4752]: I0929 10:44:26.106961 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:26 crc kubenswrapper[4752]: I0929 10:44:26.430819 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:26 crc kubenswrapper[4752]: I0929 10:44:26.432732 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:26 crc kubenswrapper[4752]: I0929 10:44:26.432794 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:26 crc kubenswrapper[4752]: I0929 10:44:26.432831 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:26 crc kubenswrapper[4752]: I0929 10:44:26.432872 4752 kubelet_node_status.go:76] "Attempting to register node" 
node="crc" Sep 29 10:44:26 crc kubenswrapper[4752]: I0929 10:44:26.610275 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 10:44:26 crc kubenswrapper[4752]: I0929 10:44:26.610519 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:26 crc kubenswrapper[4752]: I0929 10:44:26.612151 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:26 crc kubenswrapper[4752]: I0929 10:44:26.612190 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:26 crc kubenswrapper[4752]: I0929 10:44:26.612201 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:27 crc kubenswrapper[4752]: I0929 10:44:27.107527 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:27 crc kubenswrapper[4752]: I0929 10:44:27.108772 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:27 crc kubenswrapper[4752]: I0929 10:44:27.108835 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:27 crc kubenswrapper[4752]: I0929 10:44:27.108846 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:27 crc kubenswrapper[4752]: I0929 10:44:27.305967 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 10:44:27 crc kubenswrapper[4752]: I0929 10:44:27.306159 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:27 crc kubenswrapper[4752]: I0929 
10:44:27.307558 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:27 crc kubenswrapper[4752]: I0929 10:44:27.307632 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:27 crc kubenswrapper[4752]: I0929 10:44:27.307661 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:28 crc kubenswrapper[4752]: I0929 10:44:28.003052 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 10:44:28 crc kubenswrapper[4752]: I0929 10:44:28.009965 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 10:44:28 crc kubenswrapper[4752]: I0929 10:44:28.110491 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:28 crc kubenswrapper[4752]: I0929 10:44:28.115627 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:28 crc kubenswrapper[4752]: I0929 10:44:28.115683 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:28 crc kubenswrapper[4752]: I0929 10:44:28.115698 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:28 crc kubenswrapper[4752]: I0929 10:44:28.394064 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 10:44:28 crc kubenswrapper[4752]: I0929 10:44:28.394343 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:28 crc kubenswrapper[4752]: I0929 10:44:28.395783 4752 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:28 crc kubenswrapper[4752]: I0929 10:44:28.395848 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:28 crc kubenswrapper[4752]: I0929 10:44:28.395857 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:28 crc kubenswrapper[4752]: I0929 10:44:28.663605 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 10:44:28 crc kubenswrapper[4752]: I0929 10:44:28.663847 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:28 crc kubenswrapper[4752]: I0929 10:44:28.665179 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:28 crc kubenswrapper[4752]: I0929 10:44:28.665219 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:28 crc kubenswrapper[4752]: I0929 10:44:28.665232 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:28 crc kubenswrapper[4752]: I0929 10:44:28.671772 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 10:44:29 crc kubenswrapper[4752]: I0929 10:44:29.112736 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:29 crc kubenswrapper[4752]: I0929 10:44:29.114094 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:29 crc kubenswrapper[4752]: I0929 10:44:29.114157 4752 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:29 crc kubenswrapper[4752]: I0929 10:44:29.114172 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:29 crc kubenswrapper[4752]: I0929 10:44:29.610478 4752 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 29 10:44:29 crc kubenswrapper[4752]: I0929 10:44:29.610662 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 29 10:44:30 crc kubenswrapper[4752]: I0929 10:44:30.115698 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:30 crc kubenswrapper[4752]: I0929 10:44:30.117030 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:30 crc kubenswrapper[4752]: I0929 10:44:30.117093 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:30 crc kubenswrapper[4752]: I0929 10:44:30.117107 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:30 crc kubenswrapper[4752]: E0929 10:44:30.124156 4752 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 29 10:44:32 crc kubenswrapper[4752]: I0929 10:44:32.327928 4752 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Sep 29 10:44:32 crc kubenswrapper[4752]: I0929 10:44:32.328195 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:32 crc kubenswrapper[4752]: I0929 10:44:32.329704 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:32 crc kubenswrapper[4752]: I0929 10:44:32.329758 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:32 crc kubenswrapper[4752]: I0929 10:44:32.329770 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:33 crc kubenswrapper[4752]: I0929 10:44:33.973607 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Sep 29 10:44:34 crc kubenswrapper[4752]: W0929 10:44:34.095270 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Sep 29 10:44:34 crc kubenswrapper[4752]: I0929 10:44:34.095400 4752 trace.go:236] Trace[7410631]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Sep-2025 10:44:24.093) (total time: 10001ms): Sep 29 10:44:34 crc kubenswrapper[4752]: Trace[7410631]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (10:44:34.095) Sep 29 10:44:34 crc kubenswrapper[4752]: Trace[7410631]: [10.001496425s] [10.001496425s] END Sep 29 10:44:34 crc kubenswrapper[4752]: E0929 10:44:34.095429 4752 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Sep 29 10:44:35 crc kubenswrapper[4752]: I0929 10:44:35.132314 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 29 10:44:35 crc kubenswrapper[4752]: I0929 10:44:35.134500 4752 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="243027f874c8561246fc694e29dc97e29bdfa821afba81016f1c3dd7433a43d3" exitCode=255 Sep 29 10:44:35 crc kubenswrapper[4752]: I0929 10:44:35.134552 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"243027f874c8561246fc694e29dc97e29bdfa821afba81016f1c3dd7433a43d3"} Sep 29 10:44:35 crc kubenswrapper[4752]: I0929 10:44:35.134711 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:35 crc kubenswrapper[4752]: I0929 10:44:35.135664 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:35 crc kubenswrapper[4752]: I0929 10:44:35.135718 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:35 crc kubenswrapper[4752]: I0929 10:44:35.135732 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:35 crc kubenswrapper[4752]: I0929 10:44:35.138750 4752 scope.go:117] "RemoveContainer" containerID="243027f874c8561246fc694e29dc97e29bdfa821afba81016f1c3dd7433a43d3" Sep 29 
10:44:35 crc kubenswrapper[4752]: I0929 10:44:35.376866 4752 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 29 10:44:35 crc kubenswrapper[4752]: I0929 10:44:35.376955 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 29 10:44:35 crc kubenswrapper[4752]: I0929 10:44:35.381568 4752 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 29 10:44:35 crc kubenswrapper[4752]: I0929 10:44:35.381622 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 29 10:44:36 crc kubenswrapper[4752]: I0929 10:44:36.140133 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 29 10:44:36 crc kubenswrapper[4752]: I0929 10:44:36.142390 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee"} Sep 29 10:44:36 crc kubenswrapper[4752]: I0929 10:44:36.142573 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:36 crc kubenswrapper[4752]: I0929 10:44:36.143504 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:36 crc kubenswrapper[4752]: I0929 10:44:36.143542 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:36 crc kubenswrapper[4752]: I0929 10:44:36.143551 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:38 crc kubenswrapper[4752]: I0929 10:44:38.401623 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 10:44:38 crc kubenswrapper[4752]: I0929 10:44:38.401866 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:38 crc kubenswrapper[4752]: I0929 10:44:38.401968 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 10:44:38 crc kubenswrapper[4752]: I0929 10:44:38.403304 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:38 crc kubenswrapper[4752]: I0929 10:44:38.403360 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:38 crc kubenswrapper[4752]: I0929 10:44:38.403383 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:38 crc kubenswrapper[4752]: I0929 10:44:38.406208 4752 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 10:44:38 crc kubenswrapper[4752]: I0929 10:44:38.676609 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 10:44:38 crc kubenswrapper[4752]: I0929 10:44:38.676765 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:38 crc kubenswrapper[4752]: I0929 10:44:38.678001 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:38 crc kubenswrapper[4752]: I0929 10:44:38.678049 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:38 crc kubenswrapper[4752]: I0929 10:44:38.678073 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:39 crc kubenswrapper[4752]: I0929 10:44:39.150507 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:39 crc kubenswrapper[4752]: I0929 10:44:39.151476 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:39 crc kubenswrapper[4752]: I0929 10:44:39.151507 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:39 crc kubenswrapper[4752]: I0929 10:44:39.151517 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:39 crc kubenswrapper[4752]: I0929 10:44:39.611188 4752 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 29 10:44:39 crc kubenswrapper[4752]: I0929 10:44:39.611289 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 29 10:44:39 crc kubenswrapper[4752]: I0929 10:44:39.711761 4752 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Sep 29 10:44:40 crc kubenswrapper[4752]: E0929 10:44:40.124290 4752 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.152534 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.153532 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.153576 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.153588 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:40 crc kubenswrapper[4752]: E0929 10:44:40.380967 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.383030 4752 trace.go:236] Trace[594863160]: "Reflector ListAndWatch" 
name:k8s.io/client-go/informers/factory.go:160 (29-Sep-2025 10:44:26.974) (total time: 13408ms): Sep 29 10:44:40 crc kubenswrapper[4752]: Trace[594863160]: ---"Objects listed" error: 13408ms (10:44:40.382) Sep 29 10:44:40 crc kubenswrapper[4752]: Trace[594863160]: [13.408579029s] [13.408579029s] END Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.383071 4752 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.383514 4752 trace.go:236] Trace[2099903695]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Sep-2025 10:44:26.980) (total time: 13402ms): Sep 29 10:44:40 crc kubenswrapper[4752]: Trace[2099903695]: ---"Objects listed" error: 13402ms (10:44:40.383) Sep 29 10:44:40 crc kubenswrapper[4752]: Trace[2099903695]: [13.402923315s] [13.402923315s] END Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.383541 4752 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Sep 29 10:44:40 crc kubenswrapper[4752]: E0929 10:44:40.383691 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.387168 4752 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.387172 4752 trace.go:236] Trace[1982623392]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Sep-2025 10:44:28.960) (total time: 11426ms): Sep 29 10:44:40 crc kubenswrapper[4752]: Trace[1982623392]: ---"Objects listed" error: 11426ms (10:44:40.387) Sep 29 10:44:40 crc kubenswrapper[4752]: Trace[1982623392]: [11.426486608s] [11.426486608s] END Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.387209 4752 reflector.go:368] Caches populated for 
*v1.Node from k8s.io/client-go/informers/factory.go:160 Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.971897 4752 apiserver.go:52] "Watching apiserver" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.974215 4752 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.975306 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.976440 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:40 crc kubenswrapper[4752]: E0929 10:44:40.976545 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.976897 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.976922 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.976987 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:44:40 crc kubenswrapper[4752]: E0929 10:44:40.977017 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.977064 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:44:40 crc kubenswrapper[4752]: E0929 10:44:40.977090 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.977134 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.979528 4752 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.981644 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.982876 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.984469 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.984563 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.984748 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.984771 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.985042 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.986455 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.988982 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991290 4752 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991341 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991370 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991397 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991418 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991438 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991460 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991489 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991514 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991536 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991561 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991582 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991604 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991627 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991649 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991674 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991697 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991721 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991688 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991742 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991766 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991793 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991793 4752 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991844 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.991868 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.992039 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.992065 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.992084 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.992108 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.992131 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.992153 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.992175 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.992198 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.992221 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.992245 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.992642 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.992655 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.992849 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.992856 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.993037 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.993077 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.993218 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.993316 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.993398 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.993451 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.993469 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.993487 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.993538 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.993684 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.993840 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.993926 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.993947 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.994042 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.994043 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.994179 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.994188 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.994253 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.994231 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.994311 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.994735 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.994336 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.994411 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.994973 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: E0929 10:44:40.995157 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:44:41.49512345 +0000 UTC m=+22.284265277 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.995164 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.995012 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.995262 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.995436 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.995516 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.995555 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.995590 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.995604 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.995855 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996015 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996109 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996185 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996226 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996251 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996271 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996583 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996612 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996302 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996704 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996735 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996819 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996839 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996857 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996881 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996906 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996924 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996947 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996964 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996981 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996989 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.996999 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997021 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997042 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997065 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997087 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997111 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997132 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997154 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997176 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997201 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997234 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997262 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997288 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997316 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997342 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997365 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997392 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997422 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997451 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997477 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997502 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997534 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997558 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997597 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997619 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997643 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997666 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997689 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997747 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997768 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997789 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997830 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997857 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997877 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997898 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997920 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997942 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997966 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.997988 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998034 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998058 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998079 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998103 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998129 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998151 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998176 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998197 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998222 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998243 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998269 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998292 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998315 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998340 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998368 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998397 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998428 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998454 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998478 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998505 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998529 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998553 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998577 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998599 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998623 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998644 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998665 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998687 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998739 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998764 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998787 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998839 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998864 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998888 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998913 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998938 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998961 4752 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.998983 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.999007 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.999028 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.999052 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.999074 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.999096 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.999120 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.999145 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.999166 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.999195 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.999219 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.999239 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.999260 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 29 10:44:40 crc kubenswrapper[4752]: I0929 10:44:40.999288 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999315 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999339 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999362 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999386 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999412 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999437 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999461 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 29 10:44:41 crc 
kubenswrapper[4752]: I0929 10:44:40.999487 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999511 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999536 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999559 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999583 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999605 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999630 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999652 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999674 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999696 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999718 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999742 4752 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999766 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999790 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.004758 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.004838 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.004872 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.004931 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.004960 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.004989 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005019 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005044 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005068 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005091 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005115 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005138 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005168 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005192 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005216 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005241 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005265 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005296 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005323 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 
10:44:41.005350 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005375 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005399 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005423 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005448 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005471 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005497 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005521 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005547 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005570 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005597 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005633 4752 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005724 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005754 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.006900 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.007030 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.007930 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.007956 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011314 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011378 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011416 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011446 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011471 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011492 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011520 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011540 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011611 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011625 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011636 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011647 4752 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011658 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011669 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc 
kubenswrapper[4752]: I0929 10:44:41.011679 4752 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011689 4752 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011700 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011710 4752 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011720 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011730 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011740 4752 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011750 4752 reconciler_common.go:293] "Volume detached 
for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011760 4752 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011769 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011781 4752 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011790 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011819 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011829 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011840 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011850 4752 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011895 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011909 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011919 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011929 4752 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011939 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011951 4752 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on 
node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011962 4752 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011973 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011983 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011995 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.012004 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.012014 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.012024 4752 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.012034 4752 reconciler_common.go:293] "Volume 
detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.012044 4752 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.012053 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.012064 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.012074 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.997393 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.997718 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.998916 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999245 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:40.999565 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.000165 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.000624 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.000658 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.000944 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.001209 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.001565 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.001574 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.014358 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.001968 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.001972 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.002043 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.002181 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.004729 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005150 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005241 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005334 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005844 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.005918 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.006189 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.006297 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.006465 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.006569 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.006849 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.006888 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.006891 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.007086 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.007440 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.007744 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.007837 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.008347 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.008750 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.009579 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.011892 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.012149 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.012211 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.012282 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.012312 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). 
InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.012567 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.012640 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.012786 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.012965 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.013056 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.013201 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.013410 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.013448 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.013658 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.013757 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.014099 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.014195 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.022489 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.022688 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.022958 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.022997 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.023146 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.023257 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.023138 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.023232 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.023516 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.023902 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.023936 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.023986 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.024038 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.024281 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.024260 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.024367 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.024839 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.025006 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.025450 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.025677 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.026188 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.026400 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.026466 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.026600 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.026749 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.026757 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.027020 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.027103 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.027427 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.027637 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:41.527618672 +0000 UTC m=+22.316760339 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.027763 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.028147 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.028186 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.028335 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.028396 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.028493 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.028627 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.028821 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.029103 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.029266 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.029469 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.030108 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.030845 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.031520 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.031755 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.032026 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.033145 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.033876 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.033960 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.034182 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.034195 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.034342 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.034457 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.034643 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.034706 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.034747 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.035356 4752 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.035926 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.036216 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.036392 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.037087 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.037129 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.037325 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.037392 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.037514 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.037672 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.037718 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.037702 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.037982 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.038171 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.038238 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.038355 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:41.538334317 +0000 UTC m=+22.327475984 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.039458 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.041093 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.041658 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.041756 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.042246 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.042578 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.042647 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.042727 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.042819 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.043104 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.043324 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.044345 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.045323 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.045530 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.046057 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.046211 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.046557 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.047013 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.047499 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.047823 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.048612 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.051700 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.052047 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.052906 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.054689 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.054722 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.054736 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.054824 4752 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:41.554789838 +0000 UTC m=+22.343931505 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.055118 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.055321 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.055345 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.058996 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.059101 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.059598 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.061109 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.061117 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.061467 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.061518 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.061536 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.061539 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" 
(UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.061615 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:41.561591832 +0000 UTC m=+22.350733499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.061967 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.062452 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.064563 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.064648 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.067560 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.079206 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.079603 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.080082 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.084622 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.090532 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.099275 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.103513 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.112625 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.112819 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.112884 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.112899 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.112913 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.112925 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.112937 4752 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.112948 4752 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.112960 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.112972 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.112984 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.112995 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113011 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113023 4752 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113036 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113047 4752 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113059 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113072 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113083 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113095 4752 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113107 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113117 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113128 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113139 4752 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113151 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113164 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113176 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113187 4752 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on 
node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113197 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113208 4752 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113219 4752 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113230 4752 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113244 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113256 4752 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113268 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc 
kubenswrapper[4752]: I0929 10:44:41.113279 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113292 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113303 4752 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113316 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113328 4752 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113341 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113353 4752 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113364 4752 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113374 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113385 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113398 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113411 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113424 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113437 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113449 4752 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc 
kubenswrapper[4752]: I0929 10:44:41.113460 4752 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113472 4752 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113468 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113485 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113534 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113538 4752 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113586 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113600 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113613 4752 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113591 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.113627 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114027 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114079 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114097 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114110 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114124 4752 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114137 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114150 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114163 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114175 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114187 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114200 4752 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114214 4752 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114226 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114238 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114250 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath 
\"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114263 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114284 4752 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114298 4752 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114310 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114323 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114336 4752 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114349 4752 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114360 4752 
reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114372 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114384 4752 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114396 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114408 4752 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114420 4752 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114432 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114445 4752 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114458 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114473 4752 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114501 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114523 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114537 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114549 4752 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114561 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114574 4752 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114584 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114595 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114608 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114619 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114631 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114644 4752 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114655 4752 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114666 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114678 4752 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114688 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114699 4752 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114712 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114724 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc 
kubenswrapper[4752]: I0929 10:44:41.114736 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114748 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114762 4752 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114776 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114789 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114829 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114844 4752 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114856 4752 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114870 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114892 4752 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114907 4752 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114920 4752 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114935 4752 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114949 4752 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114964 4752 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114977 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.114990 4752 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115002 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115015 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115028 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115041 4752 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115053 4752 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath 
\"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115065 4752 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115076 4752 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115088 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115099 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115111 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115123 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115135 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115147 4752 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115159 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115172 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115184 4752 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115196 4752 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115209 4752 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115223 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115236 4752 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115247 4752 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115259 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115272 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115284 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115295 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115306 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115318 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115328 4752 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115340 4752 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.115352 4752 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.125570 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.139234 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.151434 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.165591 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.186268 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.199586 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.211573 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.290873 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.305278 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.325686 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 10:44:41 crc kubenswrapper[4752]: W0929 10:44:41.341503 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-4e8502f53c68786a0caafd06500d3e850c51e0f397bea92ad496eca159b1af5b WatchSource:0}: Error finding container 4e8502f53c68786a0caafd06500d3e850c51e0f397bea92ad496eca159b1af5b: Status 404 returned error can't find the container with id 4e8502f53c68786a0caafd06500d3e850c51e0f397bea92ad496eca159b1af5b Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.456250 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-7kp7q"] Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.456822 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7kp7q" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.458480 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.459090 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.459185 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.471248 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.482320 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.491975 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.502523 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.514115 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.517295 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.517393 4752 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/66a61a7f-9be6-486b-a425-62ed62ec0ebd-hosts-file\") pod \"node-resolver-7kp7q\" (UID: \"66a61a7f-9be6-486b-a425-62ed62ec0ebd\") " pod="openshift-dns/node-resolver-7kp7q" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.517418 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kgr2\" (UniqueName: \"kubernetes.io/projected/66a61a7f-9be6-486b-a425-62ed62ec0ebd-kube-api-access-2kgr2\") pod \"node-resolver-7kp7q\" (UID: \"66a61a7f-9be6-486b-a425-62ed62ec0ebd\") " pod="openshift-dns/node-resolver-7kp7q" Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.517560 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:44:42.517543209 +0000 UTC m=+23.306684876 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.524385 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.539045 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.618072 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:41 crc kubenswrapper[4752]: 
I0929 10:44:41.618127 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.618152 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.618193 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/66a61a7f-9be6-486b-a425-62ed62ec0ebd-hosts-file\") pod \"node-resolver-7kp7q\" (UID: \"66a61a7f-9be6-486b-a425-62ed62ec0ebd\") " pod="openshift-dns/node-resolver-7kp7q" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.618219 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kgr2\" (UniqueName: \"kubernetes.io/projected/66a61a7f-9be6-486b-a425-62ed62ec0ebd-kube-api-access-2kgr2\") pod \"node-resolver-7kp7q\" (UID: \"66a61a7f-9be6-486b-a425-62ed62ec0ebd\") " pod="openshift-dns/node-resolver-7kp7q" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.618242 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.618315 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.618354 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.618373 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:42.618358231 +0000 UTC m=+23.407499898 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.618502 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:42.618474064 +0000 UTC m=+23.407615791 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.618583 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/66a61a7f-9be6-486b-a425-62ed62ec0ebd-hosts-file\") pod \"node-resolver-7kp7q\" (UID: \"66a61a7f-9be6-486b-a425-62ed62ec0ebd\") " pod="openshift-dns/node-resolver-7kp7q" Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.618705 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.618723 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.618739 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.618776 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:42.618765471 +0000 UTC m=+23.407907138 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.618863 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.618878 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.618888 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:41 crc kubenswrapper[4752]: E0929 10:44:41.618922 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:42.618911835 +0000 UTC m=+23.408053572 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.643122 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kgr2\" (UniqueName: \"kubernetes.io/projected/66a61a7f-9be6-486b-a425-62ed62ec0ebd-kube-api-access-2kgr2\") pod \"node-resolver-7kp7q\" (UID: \"66a61a7f-9be6-486b-a425-62ed62ec0ebd\") " pod="openshift-dns/node-resolver-7kp7q" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.794092 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7kp7q" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.826303 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-mgrvs"] Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.826884 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.827734 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-vm6zb"] Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.828321 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.828543 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-xv5q7"] Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.828646 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.828759 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.828769 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-xv5q7" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.829074 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.829184 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.830745 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.830854 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.831053 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.831087 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.831237 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.831271 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.831386 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.833306 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.842054 4752 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.854258 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.866996 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.878774 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.890623 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.901443 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.912493 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.920555 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.922041 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5863c243-797d-462a-b11f-71aaf005f8d1-rootfs\") pod \"machine-config-daemon-mgrvs\" (UID: \"5863c243-797d-462a-b11f-71aaf005f8d1\") " pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 10:44:41 
crc kubenswrapper[4752]: I0929 10:44:41.922105 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-system-cni-dir\") pod \"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.922141 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-host-run-k8s-cni-cncf-io\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.922199 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-cnibin\") pod \"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.922268 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-hostroot\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.922297 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/52fc9378-c37b-424b-afde-7b191bab5fde-cni-binary-copy\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 
10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.922365 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-multus-socket-dir-parent\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.922427 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5863c243-797d-462a-b11f-71aaf005f8d1-mcd-auth-proxy-config\") pod \"machine-config-daemon-mgrvs\" (UID: \"5863c243-797d-462a-b11f-71aaf005f8d1\") " pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.922457 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdtpd\" (UniqueName: \"kubernetes.io/projected/5863c243-797d-462a-b11f-71aaf005f8d1-kube-api-access-qdtpd\") pod \"machine-config-daemon-mgrvs\" (UID: \"5863c243-797d-462a-b11f-71aaf005f8d1\") " pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.922485 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.922524 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.922589 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-cnibin\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.922614 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-multus-conf-dir\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.922639 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-host-run-multus-certs\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.922656 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-os-release\") pod \"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.922718 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-host-var-lib-kubelet\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.922776 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-os-release\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.922820 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-host-run-netns\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.922838 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-etc-kubernetes\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.922937 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-system-cni-dir\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.923001 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5863c243-797d-462a-b11f-71aaf005f8d1-proxy-tls\") pod 
\"machine-config-daemon-mgrvs\" (UID: \"5863c243-797d-462a-b11f-71aaf005f8d1\") " pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.923077 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-multus-cni-dir\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.923101 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-host-var-lib-cni-multus\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.923130 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4rqn\" (UniqueName: \"kubernetes.io/projected/52fc9378-c37b-424b-afde-7b191bab5fde-kube-api-access-v4rqn\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.923156 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4znxf\" (UniqueName: \"kubernetes.io/projected/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-kube-api-access-4znxf\") pod \"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.923178 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-host-var-lib-cni-bin\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.923195 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/52fc9378-c37b-424b-afde-7b191bab5fde-multus-daemon-config\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.923212 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-cni-binary-copy\") pod \"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.933963 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.946279 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.954878 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.966505 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.977138 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.987024 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:41 crc kubenswrapper[4752]: I0929 10:44:41.999453 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.008368 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.019507 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.023671 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-host-run-multus-certs\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.023713 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-os-release\") pod \"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.023743 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-host-var-lib-kubelet\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 
29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.023764 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-host-run-netns\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.023794 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-etc-kubernetes\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.023823 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-host-run-multus-certs\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.023846 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-os-release\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.023874 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-system-cni-dir\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.023871 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-host-var-lib-kubelet\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.023893 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-multus-cni-dir\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.023880 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-etc-kubernetes\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.023913 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-host-var-lib-cni-multus\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.023936 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-system-cni-dir\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.023889 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-os-release\") pod \"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " 
pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.023955 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4rqn\" (UniqueName: \"kubernetes.io/projected/52fc9378-c37b-424b-afde-7b191bab5fde-kube-api-access-v4rqn\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.023947 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-os-release\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.023940 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-host-var-lib-cni-multus\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.023987 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5863c243-797d-462a-b11f-71aaf005f8d1-proxy-tls\") pod \"machine-config-daemon-mgrvs\" (UID: \"5863c243-797d-462a-b11f-71aaf005f8d1\") " pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.023973 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-host-run-netns\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024015 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4znxf\" (UniqueName: \"kubernetes.io/projected/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-kube-api-access-4znxf\") pod \"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024079 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/52fc9378-c37b-424b-afde-7b191bab5fde-multus-daemon-config\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024100 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-multus-cni-dir\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024116 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-cni-binary-copy\") pod \"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024143 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-host-var-lib-cni-bin\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024162 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-system-cni-dir\") pod \"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024198 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5863c243-797d-462a-b11f-71aaf005f8d1-rootfs\") pod \"machine-config-daemon-mgrvs\" (UID: \"5863c243-797d-462a-b11f-71aaf005f8d1\") " pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024218 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-host-run-k8s-cni-cncf-io\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024236 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-cnibin\") pod \"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024260 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-hostroot\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024275 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-multus-socket-dir-parent\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024301 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5863c243-797d-462a-b11f-71aaf005f8d1-mcd-auth-proxy-config\") pod \"machine-config-daemon-mgrvs\" (UID: \"5863c243-797d-462a-b11f-71aaf005f8d1\") " pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024319 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdtpd\" (UniqueName: \"kubernetes.io/projected/5863c243-797d-462a-b11f-71aaf005f8d1-kube-api-access-qdtpd\") pod \"machine-config-daemon-mgrvs\" (UID: \"5863c243-797d-462a-b11f-71aaf005f8d1\") " pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024335 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024355 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/52fc9378-c37b-424b-afde-7b191bab5fde-cni-binary-copy\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024373 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024411 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-cnibin\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024420 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-hostroot\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024468 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-multus-conf-dir\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024501 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-multus-socket-dir-parent\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024537 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-system-cni-dir\") pod 
\"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024539 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5863c243-797d-462a-b11f-71aaf005f8d1-rootfs\") pod \"machine-config-daemon-mgrvs\" (UID: \"5863c243-797d-462a-b11f-71aaf005f8d1\") " pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024563 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-host-run-k8s-cni-cncf-io\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024428 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-multus-conf-dir\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024612 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-cnibin\") pod \"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024708 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/52fc9378-c37b-424b-afde-7b191bab5fde-multus-daemon-config\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " 
pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024752 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-cnibin\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.024504 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/52fc9378-c37b-424b-afde-7b191bab5fde-host-var-lib-cni-bin\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.025031 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-cni-binary-copy\") pod \"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.025339 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/52fc9378-c37b-424b-afde-7b191bab5fde-cni-binary-copy\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.025428 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.025454 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.025592 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5863c243-797d-462a-b11f-71aaf005f8d1-mcd-auth-proxy-config\") pod \"machine-config-daemon-mgrvs\" (UID: \"5863c243-797d-462a-b11f-71aaf005f8d1\") " pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.027091 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5863c243-797d-462a-b11f-71aaf005f8d1-proxy-tls\") pod \"machine-config-daemon-mgrvs\" (UID: \"5863c243-797d-462a-b11f-71aaf005f8d1\") " pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.032290 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.036057 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 
10:44:42.036764 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.037907 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.038526 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.039746 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.040319 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.040929 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.041865 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.042520 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 
10:44:42.042513 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdtpd\" (UniqueName: \"kubernetes.io/projected/5863c243-797d-462a-b11f-71aaf005f8d1-kube-api-access-qdtpd\") pod \"machine-config-daemon-mgrvs\" (UID: \"5863c243-797d-462a-b11f-71aaf005f8d1\") " pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.043441 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.044013 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.045125 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.045438 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4znxf\" (UniqueName: \"kubernetes.io/projected/9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a-kube-api-access-4znxf\") pod \"multus-additional-cni-plugins-vm6zb\" (UID: \"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\") " pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.045729 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.045960 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4rqn\" (UniqueName: 
\"kubernetes.io/projected/52fc9378-c37b-424b-afde-7b191bab5fde-kube-api-access-v4rqn\") pod \"multus-xv5q7\" (UID: \"52fc9378-c37b-424b-afde-7b191bab5fde\") " pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.046271 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.047235 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.047763 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.048697 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.049092 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.049630 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.050583 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 
10:44:42.051130 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.051684 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.052463 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.053089 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.053922 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.054638 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.055957 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.056521 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 
10:44:42.057537 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.058031 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.058518 4752 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.058998 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.060626 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.061132 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.062044 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.063616 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" 
path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.064292 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.065337 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.066214 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.067384 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.067844 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.068774 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.069411 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.070404 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.070862 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.071960 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.072468 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.073554 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.074049 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.074928 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.075382 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.075912 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.076939 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.077451 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.144560 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.152581 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.159089 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7kp7q" event={"ID":"66a61a7f-9be6-486b-a425-62ed62ec0ebd","Type":"ContainerStarted","Data":"a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2"} Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.159144 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7kp7q" event={"ID":"66a61a7f-9be6-486b-a425-62ed62ec0ebd","Type":"ContainerStarted","Data":"feb6e2db0cbfcf7d4f19723f00a9b43d5af3a29f76184aa621f9c15dd433f93d"} Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.161693 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3"} Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.161699 
4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xv5q7" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.161717 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"706a56093987e37432c71cf0a73d952cd857117630d7b2e48d9d3a61f8d1ed15"} Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.165869 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418"} Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.166040 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175"} Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.166129 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4e8502f53c68786a0caafd06500d3e850c51e0f397bea92ad496eca159b1af5b"} Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.167005 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"90d4d733b933f332b494d9540503720af39a97bd0af8d676ccf729cade845e12"} Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.168604 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.169130 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.172463 4752 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee" exitCode=255 Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.172551 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee"} Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.172616 4752 scope.go:117] "RemoveContainer" containerID="243027f874c8561246fc694e29dc97e29bdfa821afba81016f1c3dd7433a43d3" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.175241 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.186998 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c2vrh"] Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.187460 4752 scope.go:117] "RemoveContainer" containerID="c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.187818 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.187946 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: E0929 10:44:42.188757 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.190687 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.191095 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.191176 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.191277 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.191470 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.191621 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.192984 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.194899 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Sep 29 10:44:42 crc kubenswrapper[4752]: W0929 10:44:42.196048 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52fc9378_c37b_424b_afde_7b191bab5fde.slice/crio-97b5023af583817d96783976ecc57d9931baa6ca5dab4662e35dfcab3673926f WatchSource:0}: Error finding container 97b5023af583817d96783976ecc57d9931baa6ca5dab4662e35dfcab3673926f: Status 404 returned error can't find the container with id 97b5023af583817d96783976ecc57d9931baa6ca5dab4662e35dfcab3673926f Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.209482 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.225883 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.226511 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-var-lib-openvswitch\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.226571 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-run-ovn\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.226605 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/94028c24-ec10-4d5c-b32c-1700e677d539-ovnkube-script-lib\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: 
I0929 10:44:42.227640 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-run-openvswitch\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.227749 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-etc-openvswitch\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.227778 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-log-socket\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.227846 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-cni-bin\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.227904 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/94028c24-ec10-4d5c-b32c-1700e677d539-ovn-node-metrics-cert\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc 
kubenswrapper[4752]: I0929 10:44:42.227938 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.227971 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-kubelet\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.227995 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/94028c24-ec10-4d5c-b32c-1700e677d539-ovnkube-config\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.228164 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-systemd-units\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.228202 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v6qn\" (UniqueName: \"kubernetes.io/projected/94028c24-ec10-4d5c-b32c-1700e677d539-kube-api-access-9v6qn\") pod \"ovnkube-node-c2vrh\" (UID: 
\"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.228228 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-cni-netd\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.228286 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-slash\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.228306 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-node-log\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.228350 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-run-systemd\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.228858 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-run-ovn-kubernetes\") pod \"ovnkube-node-c2vrh\" 
(UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.228876 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/94028c24-ec10-4d5c-b32c-1700e677d539-env-overrides\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.228910 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-run-netns\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.243461 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.259190 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.272550 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.288384 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.306014 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.326208 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331270 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/94028c24-ec10-4d5c-b32c-1700e677d539-ovnkube-script-lib\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331320 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-run-openvswitch\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331346 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-cni-bin\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331369 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/94028c24-ec10-4d5c-b32c-1700e677d539-ovn-node-metrics-cert\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331394 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-etc-openvswitch\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331413 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-log-socket\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331443 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-run-openvswitch\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331489 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-cni-bin\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331491 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331511 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-etc-openvswitch\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331452 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331537 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-log-socket\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331555 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-kubelet\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331539 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-kubelet\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331593 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/94028c24-ec10-4d5c-b32c-1700e677d539-ovnkube-config\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331612 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-systemd-units\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331631 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v6qn\" (UniqueName: \"kubernetes.io/projected/94028c24-ec10-4d5c-b32c-1700e677d539-kube-api-access-9v6qn\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331668 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-cni-netd\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331691 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-slash\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331709 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-node-log\") pod 
\"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331728 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-run-systemd\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331746 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-run-ovn-kubernetes\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331768 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/94028c24-ec10-4d5c-b32c-1700e677d539-env-overrides\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331790 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-run-netns\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331835 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-var-lib-openvswitch\") pod \"ovnkube-node-c2vrh\" (UID: 
\"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331855 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-run-ovn\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.331901 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-run-ovn\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.334294 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/94028c24-ec10-4d5c-b32c-1700e677d539-ovnkube-script-lib\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.334596 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-systemd-units\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.334901 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-run-systemd\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: 
I0929 10:44:42.335097 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-run-ovn-kubernetes\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.336584 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-cni-netd\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.336715 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-slash\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.336776 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-node-log\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.336902 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-run-netns\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.337036 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-var-lib-openvswitch\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.337350 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/94028c24-ec10-4d5c-b32c-1700e677d539-ovnkube-config\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.340952 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.345240 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/94028c24-ec10-4d5c-b32c-1700e677d539-ovn-node-metrics-cert\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.347105 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/94028c24-ec10-4d5c-b32c-1700e677d539-env-overrides\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.355493 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.355599 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v6qn\" (UniqueName: \"kubernetes.io/projected/94028c24-ec10-4d5c-b32c-1700e677d539-kube-api-access-9v6qn\") pod \"ovnkube-node-c2vrh\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.364952 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.371180 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.378520 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.384220 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.384634 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.400686 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni
/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.422565 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.436944 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://243027f874c8561246fc694e29dc97e29bdfa821afba81016f1c3dd7433a43d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:34Z\\\",\\\"message\\\":\\\"W0929 10:44:23.196348 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0929 10:44:23.196765 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759142663 cert, and key in /tmp/serving-cert-3078923962/serving-signer.crt, /tmp/serving-cert-3078923962/serving-signer.key\\\\nI0929 10:44:23.592602 1 observer_polling.go:159] Starting file observer\\\\nW0929 10:44:23.594941 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0929 10:44:23.595069 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:23.597588 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3078923962/tls.crt::/tmp/serving-cert-3078923962/tls.key\\\\\\\"\\\\nF0929 10:44:34.019478 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.451536 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.466242 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.482527 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.497841 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.509275 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.513822 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.513922 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: W0929 10:44:42.522605 4752 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94028c24_ec10_4d5c_b32c_1700e677d539.slice/crio-905a8e32917410182bb8374b5bc38a9b93ec66b303093ada9c998360c91433a2 WatchSource:0}: Error finding container 905a8e32917410182bb8374b5bc38a9b93ec66b303093ada9c998360c91433a2: Status 404 returned error can't find the container with id 905a8e32917410182bb8374b5bc38a9b93ec66b303093ada9c998360c91433a2 Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.534231 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:44:42 crc kubenswrapper[4752]: E0929 10:44:42.534510 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:44:44.534486572 +0000 UTC m=+25.323628239 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.536474 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name
\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4
ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272
e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.552532 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.566265 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.578596 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.596163 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://243027f874c8561246fc694e29dc97e29bdfa821afba81016f1c3dd7433a43d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:34Z\\\",\\\"message\\\":\\\"W0929 10:44:23.196348 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0929 10:44:23.196765 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759142663 cert, and key in /tmp/serving-cert-3078923962/serving-signer.crt, /tmp/serving-cert-3078923962/serving-signer.key\\\\nI0929 10:44:23.592602 1 observer_polling.go:159] Starting file observer\\\\nW0929 10:44:23.594941 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0929 10:44:23.595069 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:23.597588 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3078923962/tls.crt::/tmp/serving-cert-3078923962/tls.key\\\\\\\"\\\\nF0929 10:44:34.019478 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.611053 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.632392 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.635679 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.635734 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.635757 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.635816 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:44:42 crc kubenswrapper[4752]: E0929 10:44:42.635945 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 10:44:42 crc kubenswrapper[4752]: E0929 10:44:42.635969 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 10:44:42 crc kubenswrapper[4752]: E0929 10:44:42.635982 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:42 crc kubenswrapper[4752]: E0929 10:44:42.636035 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:44.636020593 +0000 UTC m=+25.425162260 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:42 crc kubenswrapper[4752]: E0929 10:44:42.636375 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 10:44:42 crc kubenswrapper[4752]: E0929 10:44:42.636409 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:44.636402053 +0000 UTC m=+25.425543720 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 10:44:42 crc kubenswrapper[4752]: E0929 10:44:42.636459 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 10:44:42 crc kubenswrapper[4752]: E0929 10:44:42.636469 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 10:44:42 crc kubenswrapper[4752]: E0929 10:44:42.636477 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:42 crc kubenswrapper[4752]: E0929 10:44:42.636497 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:44.636490725 +0000 UTC m=+25.425632382 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:42 crc kubenswrapper[4752]: E0929 10:44:42.636538 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 10:44:42 crc kubenswrapper[4752]: E0929 10:44:42.636559 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:44.636553106 +0000 UTC m=+25.425694773 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.650021 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.669728 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.683038 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.701230 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.715784 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:42 crc kubenswrapper[4752]: I0929 10:44:42.743778 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:42Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.030828 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.030907 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.030851 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:43 crc kubenswrapper[4752]: E0929 10:44:43.031025 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:44:43 crc kubenswrapper[4752]: E0929 10:44:43.031118 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:44:43 crc kubenswrapper[4752]: E0929 10:44:43.031214 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.179969 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerStarted","Data":"166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370"} Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.180047 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerStarted","Data":"32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67"} Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.180063 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerStarted","Data":"29ed23e834d6010b56d38597e0144ab4292658749c85fb4bfbb0507ea85c9cf7"} Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.181912 4752 generic.go:334] "Generic (PLEG): container finished" podID="9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a" containerID="239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af" exitCode=0 Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.181990 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" event={"ID":"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a","Type":"ContainerDied","Data":"239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af"} Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.182029 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" 
event={"ID":"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a","Type":"ContainerStarted","Data":"e18c8891b41bad8445b3e11283a9c30587e5ec798a539593827d8c53e75b9c79"} Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.183853 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.189565 4752 scope.go:117] "RemoveContainer" containerID="c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee" Sep 29 10:44:43 crc kubenswrapper[4752]: E0929 10:44:43.189755 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.190845 4752 generic.go:334] "Generic (PLEG): container finished" podID="94028c24-ec10-4d5c-b32c-1700e677d539" containerID="f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338" exitCode=0 Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.190900 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerDied","Data":"f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338"} Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.191152 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerStarted","Data":"905a8e32917410182bb8374b5bc38a9b93ec66b303093ada9c998360c91433a2"} Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.197203 4752 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xv5q7" event={"ID":"52fc9378-c37b-424b-afde-7b191bab5fde","Type":"ContainerStarted","Data":"30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189"} Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.197460 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xv5q7" event={"ID":"52fc9378-c37b-424b-afde-7b191bab5fde","Type":"ContainerStarted","Data":"97b5023af583817d96783976ecc57d9931baa6ca5dab4662e35dfcab3673926f"} Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.212154 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://243027f874c8561246fc694e29dc97e29bdfa821afba81016f1c3dd7433a43d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:34Z\\\",\\\"message\\\":\\\"W0929 10:44:23.196348 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0929 10:44:23.196765 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759142663 cert, and key in /tmp/serving-cert-3078923962/serving-signer.crt, /tmp/serving-cert-3078923962/serving-signer.key\\\\nI0929 10:44:23.592602 1 observer_polling.go:159] Starting file observer\\\\nW0929 10:44:23.594941 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0929 10:44:23.595069 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:23.597588 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3078923962/tls.crt::/tmp/serving-cert-3078923962/tls.key\\\\\\\"\\\\nF0929 10:44:34.019478 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.230356 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.245433 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.265938 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.286571 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.300290 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.318472 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.333893 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.355776 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.381279 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.402019 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.416921 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.434992 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232
ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.452991 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.473738 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.489902 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.502306 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232
ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.525560 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.539557 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.551051 4752 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.588647 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.630359 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.663824 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.703388 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.743694 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:43 crc kubenswrapper[4752]: I0929 10:44:43.783167 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-09-29T10:44:43Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.069026 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-4whp8"] Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.069972 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4whp8" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.073178 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.073178 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.073685 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.073826 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.090985 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.104493 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.117644 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.134039 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.146740 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.153440 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/398b6e5c-29ac-4701-9207-d3d269b62224-serviceca\") pod 
\"node-ca-4whp8\" (UID: \"398b6e5c-29ac-4701-9207-d3d269b62224\") " pod="openshift-image-registry/node-ca-4whp8" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.153497 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9hp4\" (UniqueName: \"kubernetes.io/projected/398b6e5c-29ac-4701-9207-d3d269b62224-kube-api-access-f9hp4\") pod \"node-ca-4whp8\" (UID: \"398b6e5c-29ac-4701-9207-d3d269b62224\") " pod="openshift-image-registry/node-ca-4whp8" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.153541 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/398b6e5c-29ac-4701-9207-d3d269b62224-host\") pod \"node-ca-4whp8\" (UID: \"398b6e5c-29ac-4701-9207-d3d269b62224\") " pod="openshift-image-registry/node-ca-4whp8" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.174099 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.187849 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.203567 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.206727 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerStarted","Data":"e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc"} Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.206780 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerStarted","Data":"486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560"} Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.206794 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" 
event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerStarted","Data":"b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4"} Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.206822 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerStarted","Data":"8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a"} Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.210026 4752 generic.go:334] "Generic (PLEG): container finished" podID="9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a" containerID="dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2" exitCode=0 Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.210106 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" event={"ID":"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a","Type":"ContainerDied","Data":"dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2"} Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.213015 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636"} Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.214107 4752 scope.go:117] "RemoveContainer" containerID="c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee" Sep 29 10:44:44 crc kubenswrapper[4752]: E0929 10:44:44.214345 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.222345 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.254530 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/398b6e5c-29ac-4701-9207-d3d269b62224-serviceca\") pod \"node-ca-4whp8\" (UID: 
\"398b6e5c-29ac-4701-9207-d3d269b62224\") " pod="openshift-image-registry/node-ca-4whp8" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.254629 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9hp4\" (UniqueName: \"kubernetes.io/projected/398b6e5c-29ac-4701-9207-d3d269b62224-kube-api-access-f9hp4\") pod \"node-ca-4whp8\" (UID: \"398b6e5c-29ac-4701-9207-d3d269b62224\") " pod="openshift-image-registry/node-ca-4whp8" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.254729 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/398b6e5c-29ac-4701-9207-d3d269b62224-host\") pod \"node-ca-4whp8\" (UID: \"398b6e5c-29ac-4701-9207-d3d269b62224\") " pod="openshift-image-registry/node-ca-4whp8" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.254842 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/398b6e5c-29ac-4701-9207-d3d269b62224-host\") pod \"node-ca-4whp8\" (UID: \"398b6e5c-29ac-4701-9207-d3d269b62224\") " pod="openshift-image-registry/node-ca-4whp8" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.256647 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/398b6e5c-29ac-4701-9207-d3d269b62224-serviceca\") pod \"node-ca-4whp8\" (UID: \"398b6e5c-29ac-4701-9207-d3d269b62224\") " pod="openshift-image-registry/node-ca-4whp8" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.269969 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.302222 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9hp4\" (UniqueName: \"kubernetes.io/projected/398b6e5c-29ac-4701-9207-d3d269b62224-kube-api-access-f9hp4\") pod \"node-ca-4whp8\" (UID: \"398b6e5c-29ac-4701-9207-d3d269b62224\") " pod="openshift-image-registry/node-ca-4whp8" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.329320 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f
961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.363628 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.410296 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.424072 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4whp8" Sep 29 10:44:44 crc kubenswrapper[4752]: W0929 10:44:44.435277 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod398b6e5c_29ac_4701_9207_d3d269b62224.slice/crio-a84ac519180471ed40368fe9fd485f7f8738d31d28c9f092f1a8b112cc93ee88 WatchSource:0}: Error finding container a84ac519180471ed40368fe9fd485f7f8738d31d28c9f092f1a8b112cc93ee88: Status 404 returned error can't find the container with id a84ac519180471ed40368fe9fd485f7f8738d31d28c9f092f1a8b112cc93ee88 Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.451056 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.485138 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.523715 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.557533 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:44:44 crc kubenswrapper[4752]: E0929 10:44:44.557751 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:44:48.557732787 +0000 UTC m=+29.346874454 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.571208 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.612122 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.646185 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.658127 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.658172 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.658200 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.658239 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:44 crc kubenswrapper[4752]: E0929 10:44:44.658355 4752 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 10:44:44 crc kubenswrapper[4752]: E0929 10:44:44.658403 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:48.658390055 +0000 UTC m=+29.447531722 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 10:44:44 crc kubenswrapper[4752]: E0929 10:44:44.658733 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 10:44:44 crc kubenswrapper[4752]: E0929 10:44:44.658754 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 10:44:44 crc kubenswrapper[4752]: E0929 10:44:44.658767 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:44 crc kubenswrapper[4752]: E0929 10:44:44.658792 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-09-29 10:44:48.658784876 +0000 UTC m=+29.447926543 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:44 crc kubenswrapper[4752]: E0929 10:44:44.658850 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 10:44:44 crc kubenswrapper[4752]: E0929 10:44:44.658872 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:48.658865988 +0000 UTC m=+29.448007655 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 10:44:44 crc kubenswrapper[4752]: E0929 10:44:44.658922 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 10:44:44 crc kubenswrapper[4752]: E0929 10:44:44.658940 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 10:44:44 crc kubenswrapper[4752]: E0929 10:44:44.658950 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:44 crc kubenswrapper[4752]: E0929 10:44:44.658975 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:48.65896805 +0000 UTC m=+29.448109717 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.682201 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.721838 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.769188 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.802188 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.840475 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.884270 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.928499 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:44 crc kubenswrapper[4752]: I0929 10:44:44.964388 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:44Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.007244 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.030647 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.030705 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.030738 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:44:45 crc kubenswrapper[4752]: E0929 10:44:45.030885 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:44:45 crc kubenswrapper[4752]: E0929 10:44:45.030986 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:44:45 crc kubenswrapper[4752]: E0929 10:44:45.031120 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.217966 4752 generic.go:334] "Generic (PLEG): container finished" podID="9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a" containerID="88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8" exitCode=0 Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.218033 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" event={"ID":"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a","Type":"ContainerDied","Data":"88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8"} Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.221627 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerStarted","Data":"e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312"} Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.221673 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerStarted","Data":"5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009"} Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.225598 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4whp8" event={"ID":"398b6e5c-29ac-4701-9207-d3d269b62224","Type":"ContainerStarted","Data":"63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520"} Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.225646 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4whp8" event={"ID":"398b6e5c-29ac-4701-9207-d3d269b62224","Type":"ContainerStarted","Data":"a84ac519180471ed40368fe9fd485f7f8738d31d28c9f092f1a8b112cc93ee88"} Sep 29 10:44:45 crc 
kubenswrapper[4752]: I0929 10:44:45.235608 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.257938 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.275499 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.287718 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.301753 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.313245 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.330543 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 
10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.344929 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.362726 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.408252 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 
10:44:45.454784 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.486557 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.523959 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.565914 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.604865 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.642854 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232
ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.683301 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.730551 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10
:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.764923 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.802881 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.847733 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 
10:44:45.886050 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.925822 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:45 crc kubenswrapper[4752]: I0929 10:44:45.964382 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:45Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.002344 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.048845 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.086084 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.134928 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.232738 4752 generic.go:334] "Generic (PLEG): container finished" podID="9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a" containerID="50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42" exitCode=0 Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.232825 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" event={"ID":"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a","Type":"ContainerDied","Data":"50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42"} Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.260356 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.277935 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.293151 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.308930 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.324589 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.366076 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.406925 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.442795 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.485708 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.534624 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.565224 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.604657 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.614835 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.618935 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.643904 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.664328 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4
rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.706210 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.745620 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.784100 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.786866 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.786917 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.786931 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.787112 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.790253 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.834571 4752 kubelet_node_status.go:115] "Node was previously registered" node="crc" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.834875 4752 kubelet_node_status.go:79] "Successfully registered node" node="crc" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.836074 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.836119 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.836133 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.836152 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.836167 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:46Z","lastTransitionTime":"2025-09-29T10:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:46 crc kubenswrapper[4752]: E0929 10:44:46.849258 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.853346 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.853393 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.853403 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.853423 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.853437 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:46Z","lastTransitionTime":"2025-09-29T10:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.863828 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: E0929 10:44:46.863864 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.867188 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.867233 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.867242 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.867259 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.867271 4752 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:46Z","lastTransitionTime":"2025-09-29T10:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:46 crc kubenswrapper[4752]: E0929 10:44:46.879408 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.883288 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.883343 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.883353 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.883374 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.883385 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:46Z","lastTransitionTime":"2025-09-29T10:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:46 crc kubenswrapper[4752]: E0929 10:44:46.895397 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.900114 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.900175 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.900196 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.900223 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.900246 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:46Z","lastTransitionTime":"2025-09-29T10:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.903894 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: E0929 10:44:46.912297 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: E0929 10:44:46.912619 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.914378 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.914418 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.914428 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.914445 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.914456 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:46Z","lastTransitionTime":"2025-09-29T10:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.942770 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:46 crc kubenswrapper[4752]: I0929 10:44:46.982577 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:46Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.017364 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.017412 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.017421 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.017439 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.017451 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:47Z","lastTransitionTime":"2025-09-29T10:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.025835 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.031145 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.031274 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:47 crc kubenswrapper[4752]: E0929 10:44:47.031399 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.031429 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:44:47 crc kubenswrapper[4752]: E0929 10:44:47.031569 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:44:47 crc kubenswrapper[4752]: E0929 10:44:47.031727 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.064202 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.103356 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.124153 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.124195 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.124207 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.124224 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.124236 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:47Z","lastTransitionTime":"2025-09-29T10:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.145766 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.188766 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.225134 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.227195 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.227256 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.227272 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.227293 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.227306 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:47Z","lastTransitionTime":"2025-09-29T10:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.241695 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerStarted","Data":"ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191"} Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.245297 4752 generic.go:334] "Generic (PLEG): container finished" podID="9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a" containerID="fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805" exitCode=0 Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.246022 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" event={"ID":"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a","Type":"ContainerDied","Data":"fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805"} Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.269783 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: E0929 10:44:47.282110 4752 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.325822 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.329831 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.329888 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.329903 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.329926 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.329942 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:47Z","lastTransitionTime":"2025-09-29T10:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.365245 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z 
is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.403926 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.432445 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.432491 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.432507 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.432524 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.432534 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:47Z","lastTransitionTime":"2025-09-29T10:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.450141 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.482681 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.524407 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.535012 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.535097 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.535123 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.535160 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.535184 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:47Z","lastTransitionTime":"2025-09-29T10:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.565156 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.604579 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.638133 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.638171 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.638180 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.638195 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.638204 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:47Z","lastTransitionTime":"2025-09-29T10:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.646364 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.686246 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.724640 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.741183 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.741250 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.741262 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.741281 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.741292 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:47Z","lastTransitionTime":"2025-09-29T10:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.766077 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.811331 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.844017 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.844061 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.844073 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.844092 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.844105 4752 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:47Z","lastTransitionTime":"2025-09-29T10:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.845303 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.885133 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.923082 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.946643 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.946687 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.946700 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.946721 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.946736 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:47Z","lastTransitionTime":"2025-09-29T10:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:47 crc kubenswrapper[4752]: I0929 10:44:47.965673 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:47Z 
is after 2025-08-24T17:21:41Z" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.049537 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.049591 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.049601 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.049618 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.049628 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:48Z","lastTransitionTime":"2025-09-29T10:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.157417 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.158075 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.158101 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.158171 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.158208 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:48Z","lastTransitionTime":"2025-09-29T10:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.252897 4752 generic.go:334] "Generic (PLEG): container finished" podID="9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a" containerID="6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec" exitCode=0 Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.253004 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" event={"ID":"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a","Type":"ContainerDied","Data":"6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec"} Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.261010 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.261059 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.261082 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.261103 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.261113 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:48Z","lastTransitionTime":"2025-09-29T10:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.271532 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:48Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.288341 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:48Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.300631 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:48Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.316981 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:48Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.343301 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:48Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.356926 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:48Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.363766 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.363825 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.363840 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.363860 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.363872 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:48Z","lastTransitionTime":"2025-09-29T10:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.371852 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:48Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.388708 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:48Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.410898 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-09-29T10:44:48Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.429400 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:48Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.453785 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:48Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.470414 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.470482 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.470496 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.470520 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.470535 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:48Z","lastTransitionTime":"2025-09-29T10:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.483423 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:48Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.507724 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:44:48Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.525110 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:48Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.560783 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:48Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.572910 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.572950 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.572959 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.572983 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.572994 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:48Z","lastTransitionTime":"2025-09-29T10:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.601736 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:44:48 crc kubenswrapper[4752]: E0929 10:44:48.602060 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:44:56.602024746 +0000 UTC m=+37.391166553 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.675960 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.676159 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.676194 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.676219 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:48 crc kubenswrapper[4752]: 
I0929 10:44:48.676592 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:48Z","lastTransitionTime":"2025-09-29T10:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.703309 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.703365 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.703394 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.703413 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:44:48 crc kubenswrapper[4752]: E0929 10:44:48.703562 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 10:44:48 crc kubenswrapper[4752]: E0929 10:44:48.703581 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 10:44:48 crc kubenswrapper[4752]: E0929 10:44:48.703592 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:48 crc kubenswrapper[4752]: E0929 10:44:48.703648 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:56.703632526 +0000 UTC m=+37.492774193 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:48 crc kubenswrapper[4752]: E0929 10:44:48.703647 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 10:44:48 crc kubenswrapper[4752]: E0929 10:44:48.703701 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 10:44:48 crc kubenswrapper[4752]: E0929 10:44:48.703704 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 10:44:48 crc kubenswrapper[4752]: E0929 10:44:48.703747 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 10:44:48 crc kubenswrapper[4752]: E0929 10:44:48.703845 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:56.703823432 +0000 UTC m=+37.492965099 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 10:44:48 crc kubenswrapper[4752]: E0929 10:44:48.703719 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:48 crc kubenswrapper[4752]: E0929 10:44:48.703866 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:56.703858213 +0000 UTC m=+37.492999870 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 10:44:48 crc kubenswrapper[4752]: E0929 10:44:48.703935 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:56.703911064 +0000 UTC m=+37.493052791 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.778749 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.778844 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.778866 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.778894 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.778913 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:48Z","lastTransitionTime":"2025-09-29T10:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.882080 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.882153 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.882172 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.882202 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.882223 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:48Z","lastTransitionTime":"2025-09-29T10:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.985475 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.985538 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.985554 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.985578 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:48 crc kubenswrapper[4752]: I0929 10:44:48.985595 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:48Z","lastTransitionTime":"2025-09-29T10:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.030272 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.030330 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.030409 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:44:49 crc kubenswrapper[4752]: E0929 10:44:49.030420 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:44:49 crc kubenswrapper[4752]: E0929 10:44:49.030625 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:44:49 crc kubenswrapper[4752]: E0929 10:44:49.030881 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.087345 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.087383 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.087395 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.087416 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.087430 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:49Z","lastTransitionTime":"2025-09-29T10:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.190941 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.191004 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.191021 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.191051 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.191070 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:49Z","lastTransitionTime":"2025-09-29T10:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.263105 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" event={"ID":"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a","Type":"ContainerStarted","Data":"d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397"} Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.267901 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerStarted","Data":"1374b35afd65797b94d674c0e4f0932a46ac76c73c5c03e6dbd42e66182b58ee"} Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.268198 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.280063 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.295534 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.298903 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.298970 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.299004 4752 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.299033 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.299065 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:49Z","lastTransitionTime":"2025-09-29T10:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.301884 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.313522 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.333338 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.347123 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.368695 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.385374 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.403025 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.403090 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.403110 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 
10:44:49.403136 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.403152 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:49Z","lastTransitionTime":"2025-09-29T10:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.405729 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.423295 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232
ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.436477 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.457874 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.475793 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.491919 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.505855 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.505903 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.505917 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.505938 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.505952 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:49Z","lastTransitionTime":"2025-09-29T10:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.509019 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.535711 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.551451 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.567364 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.578567 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.595945 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e
4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca
5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09
-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.608123 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.608171 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.608188 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.608212 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.608227 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:49Z","lastTransitionTime":"2025-09-29T10:44:49Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.618244 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1374b35afd65797b94d674c0e4f0932a46ac76c73c5c03e6dbd42e66182b58ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.632014 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.643939 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.658435 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.674699 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.685691 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.703208 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.710834 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.710895 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.710908 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.710933 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.710949 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:49Z","lastTransitionTime":"2025-09-29T10:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.715210 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.726247 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.737910 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.760910 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:49Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.813500 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.813551 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.813563 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.813584 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.813595 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:49Z","lastTransitionTime":"2025-09-29T10:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.916684 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.916744 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.916757 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.916790 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:49 crc kubenswrapper[4752]: I0929 10:44:49.916826 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:49Z","lastTransitionTime":"2025-09-29T10:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.019855 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.019925 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.019949 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.019973 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.019988 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:50Z","lastTransitionTime":"2025-09-29T10:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.062282 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.078466 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.098141 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987f
a618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.120528 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1374b35afd65797b94d674c0e4f0932a46ac76c73c5c03e6dbd42e66182b58ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\
\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.122379 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.122418 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.122433 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.122456 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.122471 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:50Z","lastTransitionTime":"2025-09-29T10:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.138332 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.154069 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.169064 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.182290 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.197977 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.211333 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.225339 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.225380 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.225390 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.225405 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.225415 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:50Z","lastTransitionTime":"2025-09-29T10:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.226711 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.243667 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.271560 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.272058 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.283140 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.296527 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.325387 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.327533 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.327683 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.327809 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.327917 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.327995 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:50Z","lastTransitionTime":"2025-09-29T10:44:50Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.371430 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stat
ic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700
effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.404736 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.430773 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.430832 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.430846 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:50 crc 
kubenswrapper[4752]: I0929 10:44:50.430864 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.430877 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:50Z","lastTransitionTime":"2025-09-29T10:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.450150 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.484755 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.523217 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.533922 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.533982 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.533996 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.534018 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.534031 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:50Z","lastTransitionTime":"2025-09-29T10:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.561822 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.602833 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.638892 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.638959 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.638972 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.638993 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.639006 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:50Z","lastTransitionTime":"2025-09-29T10:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.647154 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.686721 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.725088 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.742398 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.742480 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.742494 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.742516 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.742530 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:50Z","lastTransitionTime":"2025-09-29T10:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.767503 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.811450 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1374b35afd65797b94d674c0e4f0932a46ac76c73c5c03e6dbd42e66182b58ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.845883 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.845920 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.845930 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.845948 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.845959 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:50Z","lastTransitionTime":"2025-09-29T10:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.846379 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a
827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.885460 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.927428 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.949710 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.949760 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.949770 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.949788 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.949818 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:50Z","lastTransitionTime":"2025-09-29T10:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:50 crc kubenswrapper[4752]: I0929 10:44:50.972381 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:50Z 
is after 2025-08-24T17:21:41Z" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.030989 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.031072 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:51 crc kubenswrapper[4752]: E0929 10:44:51.031180 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.031277 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:44:51 crc kubenswrapper[4752]: E0929 10:44:51.031492 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:44:51 crc kubenswrapper[4752]: E0929 10:44:51.031581 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.052696 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.052784 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.052820 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.052847 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.052861 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:51Z","lastTransitionTime":"2025-09-29T10:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.155664 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.155705 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.155719 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.155740 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.155757 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:51Z","lastTransitionTime":"2025-09-29T10:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.258352 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.258400 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.258416 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.258439 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.258456 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:51Z","lastTransitionTime":"2025-09-29T10:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.274536 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.360987 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.361071 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.361083 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.361108 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.361122 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:51Z","lastTransitionTime":"2025-09-29T10:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.464481 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.464531 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.464539 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.464558 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.464569 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:51Z","lastTransitionTime":"2025-09-29T10:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.567731 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.567789 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.567821 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.567843 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.567860 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:51Z","lastTransitionTime":"2025-09-29T10:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.670693 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.670772 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.670790 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.670849 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.670871 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:51Z","lastTransitionTime":"2025-09-29T10:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.773978 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.774030 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.774041 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.774068 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.774079 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:51Z","lastTransitionTime":"2025-09-29T10:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.877404 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.877468 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.877488 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.877514 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.877530 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:51Z","lastTransitionTime":"2025-09-29T10:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.980449 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.980494 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.980503 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.980524 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:51 crc kubenswrapper[4752]: I0929 10:44:51.980535 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:51Z","lastTransitionTime":"2025-09-29T10:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.083355 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.083396 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.083407 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.083421 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.083430 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:52Z","lastTransitionTime":"2025-09-29T10:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.187340 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.187411 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.187427 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.187448 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.187462 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:52Z","lastTransitionTime":"2025-09-29T10:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.282106 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovnkube-controller/0.log" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.294245 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.294339 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.294353 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.294375 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.294390 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:52Z","lastTransitionTime":"2025-09-29T10:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.294728 4752 generic.go:334] "Generic (PLEG): container finished" podID="94028c24-ec10-4d5c-b32c-1700e677d539" containerID="1374b35afd65797b94d674c0e4f0932a46ac76c73c5c03e6dbd42e66182b58ee" exitCode=1 Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.294785 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerDied","Data":"1374b35afd65797b94d674c0e4f0932a46ac76c73c5c03e6dbd42e66182b58ee"} Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.295558 4752 scope.go:117] "RemoveContainer" containerID="1374b35afd65797b94d674c0e4f0932a46ac76c73c5c03e6dbd42e66182b58ee" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.311839 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:52Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.328249 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:52Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.341526 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:52Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.357709 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e
4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca
5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09
-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:52Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.380736 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1374b35afd65797b94d674c0e4f0932a46ac76c73c5c03e6dbd42e66182b58ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1374b35afd65797b94d674c0e4f0932a46ac76c73c5c03e6dbd42e66182b58ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-
09-29T10:44:52Z\\\",\\\"message\\\":\\\"Node event handler 2 for removal\\\\nI0929 10:44:51.834387 6054 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0929 10:44:51.834400 6054 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0929 10:44:51.834450 6054 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0929 10:44:51.834462 6054 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0929 10:44:51.834469 6054 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0929 10:44:51.834477 6054 handler.go:208] Removed *v1.Node event handler 7\\\\nI0929 10:44:51.834486 6054 handler.go:208] Removed *v1.Node event handler 2\\\\nI0929 10:44:51.834513 6054 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0929 10:44:51.834541 6054 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 10:44:51.834694 6054 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0929 10:44:51.835176 6054 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0929 10:44:51.835281 6054 factory.go:656] Stopping watch factory\\\\nI0929 10:44:51.835343 6054 ovnkube.go:599] Stopped ovnkube\\\\nI0929 
10:44:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee
338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:52Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.396407 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.396473 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.396491 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.396517 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.396533 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:52Z","lastTransitionTime":"2025-09-29T10:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.396891 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b8
2799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:52Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.410999 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:52Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.425677 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:52Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.442748 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-09-29T10:44:52Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.459677 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:52Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.483185 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:52Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.500128 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.500600 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.500615 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.500637 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.500652 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:52Z","lastTransitionTime":"2025-09-29T10:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.504991 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:52Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.525946 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:44:52Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.541071 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:52Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.553505 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:52Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.604468 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.604593 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.604625 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.604659 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.604682 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:52Z","lastTransitionTime":"2025-09-29T10:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.707987 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.708044 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.708059 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.708083 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.708097 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:52Z","lastTransitionTime":"2025-09-29T10:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.811257 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.811313 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.811329 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.811363 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.811379 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:52Z","lastTransitionTime":"2025-09-29T10:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.914450 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.914500 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.914513 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.914538 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:52 crc kubenswrapper[4752]: I0929 10:44:52.914553 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:52Z","lastTransitionTime":"2025-09-29T10:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.016965 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.017026 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.017049 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.017076 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.017093 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:53Z","lastTransitionTime":"2025-09-29T10:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.030333 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.030344 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:53 crc kubenswrapper[4752]: E0929 10:44:53.030534 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.030354 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:44:53 crc kubenswrapper[4752]: E0929 10:44:53.030772 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:44:53 crc kubenswrapper[4752]: E0929 10:44:53.030938 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.120002 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.120057 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.120070 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.120088 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.120099 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:53Z","lastTransitionTime":"2025-09-29T10:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.223343 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.223406 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.223421 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.223442 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.223458 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:53Z","lastTransitionTime":"2025-09-29T10:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.300613 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovnkube-controller/0.log" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.304509 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerStarted","Data":"da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed"} Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.304620 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.318714 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:53Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.325764 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.325860 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.325878 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 
10:44:53.325903 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.325918 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:53Z","lastTransitionTime":"2025-09-29T10:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.333842 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:53Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.349174 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232
ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:53Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.366318 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:53Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.390707 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10
:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:53Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.404525 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:53Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.416255 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:53Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.429166 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.429217 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.429227 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.429245 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.429259 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:53Z","lastTransitionTime":"2025-09-29T10:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.432955 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:53Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.452229 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1374b35afd65797b94d674c0e4f0932a46ac76c73c5c03e6dbd42e66182b58ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:44:52Z\\\",\\\"message\\\":\\\"Node event handler 2 for removal\\\\nI0929 10:44:51.834387 6054 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0929 10:44:51.834400 6054 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0929 10:44:51.834450 6054 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0929 10:44:51.834462 6054 
handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0929 10:44:51.834469 6054 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0929 10:44:51.834477 6054 handler.go:208] Removed *v1.Node event handler 7\\\\nI0929 10:44:51.834486 6054 handler.go:208] Removed *v1.Node event handler 2\\\\nI0929 10:44:51.834513 6054 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0929 10:44:51.834541 6054 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 10:44:51.834694 6054 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0929 10:44:51.835176 6054 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0929 10:44:51.835281 6054 factory.go:656] Stopping watch factory\\\\nI0929 10:44:51.835343 6054 ovnkube.go:599] Stopped ovnkube\\\\nI0929 
10:44:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:53Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.470772 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f
961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:53Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.485713 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:53Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.502764 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:53Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.517755 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-09-29T10:44:53Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.532101 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:53Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.532221 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.532253 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.532265 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.532282 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.532292 4752 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:53Z","lastTransitionTime":"2025-09-29T10:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.545311 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:53Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.635059 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.635104 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.635114 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.635128 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.635137 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:53Z","lastTransitionTime":"2025-09-29T10:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.737867 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.737920 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.737929 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.737946 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.737958 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:53Z","lastTransitionTime":"2025-09-29T10:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.840764 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.840845 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.840859 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.840882 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.840896 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:53Z","lastTransitionTime":"2025-09-29T10:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.944190 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.944538 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.944628 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.944726 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:53 crc kubenswrapper[4752]: I0929 10:44:53.944798 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:53Z","lastTransitionTime":"2025-09-29T10:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.048598 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.048686 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.048705 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.048749 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.048771 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:54Z","lastTransitionTime":"2025-09-29T10:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.147475 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm"] Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.148075 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.150269 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.150342 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.152019 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.152051 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.152062 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.152079 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.152091 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:54Z","lastTransitionTime":"2025-09-29T10:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.176974 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.193832 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.208777 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.222942 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.235842 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.249483 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.254971 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.255019 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.255032 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.255051 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.255063 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:54Z","lastTransitionTime":"2025-09-29T10:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.264684 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.267914 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65f5485e-9000-4512-aad3-7d367715ac2d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mp5pm\" (UID: \"65f5485e-9000-4512-aad3-7d367715ac2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.267973 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/65f5485e-9000-4512-aad3-7d367715ac2d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mp5pm\" (UID: \"65f5485e-9000-4512-aad3-7d367715ac2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.268012 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z772z\" (UniqueName: \"kubernetes.io/projected/65f5485e-9000-4512-aad3-7d367715ac2d-kube-api-access-z772z\") pod \"ovnkube-control-plane-749d76644c-mp5pm\" (UID: \"65f5485e-9000-4512-aad3-7d367715ac2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.268044 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65f5485e-9000-4512-aad3-7d367715ac2d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mp5pm\" (UID: \"65f5485e-9000-4512-aad3-7d367715ac2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.279182 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.295217 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987f
a618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.310412 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovnkube-controller/1.log" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.312034 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovnkube-controller/0.log" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.316053 4752 generic.go:334] "Generic (PLEG): container finished" podID="94028c24-ec10-4d5c-b32c-1700e677d539" containerID="da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed" exitCode=1 Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.316128 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerDied","Data":"da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed"} Sep 
29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.316182 4752 scope.go:117] "RemoveContainer" containerID="1374b35afd65797b94d674c0e4f0932a46ac76c73c5c03e6dbd42e66182b58ee" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.316950 4752 scope.go:117] "RemoveContainer" containerID="da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed" Sep 29 10:44:54 crc kubenswrapper[4752]: E0929 10:44:54.317118 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.322985 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1374b35afd65797b94d674c0e4f0932a46ac76c73c5c03e6dbd42e66182b58ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:44:52Z\\\",\\\"message\\\":\\\"Node event handler 2 for removal\\\\nI0929 10:44:51.834387 6054 handler.go:190] Sending 
*v1.Node event handler 7 for removal\\\\nI0929 10:44:51.834400 6054 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0929 10:44:51.834450 6054 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0929 10:44:51.834462 6054 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0929 10:44:51.834469 6054 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0929 10:44:51.834477 6054 handler.go:208] Removed *v1.Node event handler 7\\\\nI0929 10:44:51.834486 6054 handler.go:208] Removed *v1.Node event handler 2\\\\nI0929 10:44:51.834513 6054 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0929 10:44:51.834541 6054 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 10:44:51.834694 6054 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0929 10:44:51.835176 6054 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0929 10:44:51.835281 6054 factory.go:656] Stopping watch factory\\\\nI0929 10:44:51.835343 6054 ovnkube.go:599] Stopped ovnkube\\\\nI0929 
10:44:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.337128 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.351361 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.357850 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.357936 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.357957 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.357987 4752 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.358007 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:54Z","lastTransitionTime":"2025-09-29T10:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.366637 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.369183 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65f5485e-9000-4512-aad3-7d367715ac2d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mp5pm\" (UID: \"65f5485e-9000-4512-aad3-7d367715ac2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" Sep 29 
10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.369235 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65f5485e-9000-4512-aad3-7d367715ac2d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mp5pm\" (UID: \"65f5485e-9000-4512-aad3-7d367715ac2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.369259 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z772z\" (UniqueName: \"kubernetes.io/projected/65f5485e-9000-4512-aad3-7d367715ac2d-kube-api-access-z772z\") pod \"ovnkube-control-plane-749d76644c-mp5pm\" (UID: \"65f5485e-9000-4512-aad3-7d367715ac2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.369279 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65f5485e-9000-4512-aad3-7d367715ac2d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mp5pm\" (UID: \"65f5485e-9000-4512-aad3-7d367715ac2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.370099 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65f5485e-9000-4512-aad3-7d367715ac2d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mp5pm\" (UID: \"65f5485e-9000-4512-aad3-7d367715ac2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.370284 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65f5485e-9000-4512-aad3-7d367715ac2d-ovnkube-config\") pod 
\"ovnkube-control-plane-749d76644c-mp5pm\" (UID: \"65f5485e-9000-4512-aad3-7d367715ac2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.377335 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65f5485e-9000-4512-aad3-7d367715ac2d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mp5pm\" (UID: \"65f5485e-9000-4512-aad3-7d367715ac2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.384010 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.389027 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z772z\" (UniqueName: \"kubernetes.io/projected/65f5485e-9000-4512-aad3-7d367715ac2d-kube-api-access-z772z\") pod \"ovnkube-control-plane-749d76644c-mp5pm\" (UID: \"65f5485e-9000-4512-aad3-7d367715ac2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.398638 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.416590 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.439342 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.453988 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.460864 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.460929 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.460947 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 
10:44:54.460975 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.460993 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:54Z","lastTransitionTime":"2025-09-29T10:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.463555 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.472480 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: W0929 10:44:54.478229 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65f5485e_9000_4512_aad3_7d367715ac2d.slice/crio-49d2edd0a6a7ba03d7171d4b5b246bf598c4d15ca8feb67718518f5148248ff2 WatchSource:0}: Error finding container 49d2edd0a6a7ba03d7171d4b5b246bf598c4d15ca8feb67718518f5148248ff2: Status 404 returned error can't find the container with id 49d2edd0a6a7ba03d7171d4b5b246bf598c4d15ca8feb67718518f5148248ff2 Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.488751 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d6
6438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.502411 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.517852 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f
961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.534705 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.547938 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.564069 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987f
a618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.564324 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.564357 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.564366 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.564382 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.564395 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:54Z","lastTransitionTime":"2025-09-29T10:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.584636 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1374b35afd65797b94d674c0e4f0932a46ac76c73c5c03e6dbd42e66182b58ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:44:52Z\\\",\\\"message\\\":\\\"Node event handler 2 for removal\\\\nI0929 10:44:51.834387 6054 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0929 10:44:51.834400 6054 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0929 10:44:51.834450 6054 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0929 10:44:51.834462 6054 
handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0929 10:44:51.834469 6054 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0929 10:44:51.834477 6054 handler.go:208] Removed *v1.Node event handler 7\\\\nI0929 10:44:51.834486 6054 handler.go:208] Removed *v1.Node event handler 2\\\\nI0929 10:44:51.834513 6054 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0929 10:44:51.834541 6054 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 10:44:51.834694 6054 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0929 10:44:51.835176 6054 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0929 10:44:51.835281 6054 factory.go:656] Stopping watch factory\\\\nI0929 10:44:51.835343 6054 ovnkube.go:599] Stopped ovnkube\\\\nI0929 10:44:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:44:53Z\\\",\\\"message\\\":\\\"mers/factory.go:160\\\\nI0929 10:44:53.327068 6174 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.326997 6174 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327207 6174 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327278 6174 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327541 6174 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.328126 6174 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0929 10:44:53.328200 6174 factory.go:656] Stopping watch factory\\\\nI0929 10:44:53.328228 6174 handler.go:208] Removed *v1.Node event handler 2\\\\nI0929 10:44:53.331274 6174 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0929 10:44:53.331331 6174 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0929 10:44:53.331404 6174 ovnkube.go:599] Stopped ovnkube\\\\nI0929 10:44:53.331469 6174 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0929 10:44:53.331650 6174 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\"
:\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.602538 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.619833 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.636088 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.649006 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.667299 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.667344 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.667353 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.667371 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.667382 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:54Z","lastTransitionTime":"2025-09-29T10:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.670695 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.683376 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:54Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.769772 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.769870 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.769883 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.769903 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.769916 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:54Z","lastTransitionTime":"2025-09-29T10:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.873115 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.873180 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.873197 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.873224 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.873279 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:54Z","lastTransitionTime":"2025-09-29T10:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.976905 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.976950 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.976966 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.976987 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:54 crc kubenswrapper[4752]: I0929 10:44:54.976999 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:54Z","lastTransitionTime":"2025-09-29T10:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.030986 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.031021 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:44:55 crc kubenswrapper[4752]: E0929 10:44:55.031165 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.031183 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:55 crc kubenswrapper[4752]: E0929 10:44:55.031433 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:44:55 crc kubenswrapper[4752]: E0929 10:44:55.031355 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.080240 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.080302 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.080315 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.080338 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.080357 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:55Z","lastTransitionTime":"2025-09-29T10:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.183048 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.183120 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.183135 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.183156 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.183169 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:55Z","lastTransitionTime":"2025-09-29T10:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.286735 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.286850 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.286889 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.286916 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.286936 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:55Z","lastTransitionTime":"2025-09-29T10:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.323051 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" event={"ID":"65f5485e-9000-4512-aad3-7d367715ac2d","Type":"ContainerStarted","Data":"073cf9e4675b04d77ad58f0b7e1b313e3fe15e8daee4e1c8934a90924b04ad22"} Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.323116 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" event={"ID":"65f5485e-9000-4512-aad3-7d367715ac2d","Type":"ContainerStarted","Data":"db5dba49df10714a5f00ec40865af87528f6bee63ee58a89f299af7c10e4d769"} Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.323132 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" event={"ID":"65f5485e-9000-4512-aad3-7d367715ac2d","Type":"ContainerStarted","Data":"49d2edd0a6a7ba03d7171d4b5b246bf598c4d15ca8feb67718518f5148248ff2"} Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.326165 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovnkube-controller/1.log" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.332321 4752 scope.go:117] "RemoveContainer" containerID="da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed" Sep 29 10:44:55 crc kubenswrapper[4752]: E0929 10:44:55.332551 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.345953 4752 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.361484 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 
10:44:55.375540 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.389879 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.389924 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.389934 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.389952 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.389962 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:55Z","lastTransitionTime":"2025-09-29T10:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.420834 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.439424 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.457736 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.485700 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987f
a618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.493379 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.493416 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.493426 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.493442 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.493452 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:55Z","lastTransitionTime":"2025-09-29T10:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.508827 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1374b35afd65797b94d674c0e4f0932a46ac76c73c5c03e6dbd42e66182b58ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:44:52Z\\\",\\\"message\\\":\\\"Node event handler 2 for removal\\\\nI0929 10:44:51.834387 6054 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0929 10:44:51.834400 6054 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0929 10:44:51.834450 6054 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0929 10:44:51.834462 6054 
handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0929 10:44:51.834469 6054 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0929 10:44:51.834477 6054 handler.go:208] Removed *v1.Node event handler 7\\\\nI0929 10:44:51.834486 6054 handler.go:208] Removed *v1.Node event handler 2\\\\nI0929 10:44:51.834513 6054 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0929 10:44:51.834541 6054 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 10:44:51.834694 6054 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0929 10:44:51.835176 6054 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0929 10:44:51.835281 6054 factory.go:656] Stopping watch factory\\\\nI0929 10:44:51.835343 6054 ovnkube.go:599] Stopped ovnkube\\\\nI0929 10:44:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:44:53Z\\\",\\\"message\\\":\\\"mers/factory.go:160\\\\nI0929 10:44:53.327068 6174 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.326997 6174 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327207 6174 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327278 6174 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327541 6174 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.328126 6174 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0929 10:44:53.328200 6174 factory.go:656] Stopping watch factory\\\\nI0929 10:44:53.328228 6174 handler.go:208] Removed *v1.Node event handler 2\\\\nI0929 10:44:53.331274 6174 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0929 10:44:53.331331 6174 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0929 10:44:53.331404 6174 ovnkube.go:599] Stopped ovnkube\\\\nI0929 10:44:53.331469 6174 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0929 10:44:53.331650 6174 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\"
:\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.526153 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c
6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.540325 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.555501 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.570122 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.584447 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.596132 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.596174 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.596187 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.596207 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.596222 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:55Z","lastTransitionTime":"2025-09-29T10:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.602658 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5dba49df10714a5f00ec40865af87528f6bee63ee58a89f299af7c10e4d769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://073cf9e4675b04d77ad58f0b7e1b313e3fe15e8daee4e1c8934a90924b04ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.621628 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.625245 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-sq7f4"] Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.625870 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:44:55 crc kubenswrapper[4752]: E0929 10:44:55.625954 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.636772 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.659100 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.673899 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.686926 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.698462 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.698506 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.698516 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.698534 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.698547 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:55Z","lastTransitionTime":"2025-09-29T10:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.702000 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.715699 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.731341 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.748946 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.766853 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.784712 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs\") pod \"network-metrics-daemon-sq7f4\" (UID: \"0a33b92e-d79c-4162-8500-df7a89df8df3\") " pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.784856 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qck2m\" (UniqueName: \"kubernetes.io/projected/0a33b92e-d79c-4162-8500-df7a89df8df3-kube-api-access-qck2m\") pod \"network-metrics-daemon-sq7f4\" (UID: \"0a33b92e-d79c-4162-8500-df7a89df8df3\") " pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.788260 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f57
27e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.803536 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.803609 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.803633 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.803658 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.803678 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:55Z","lastTransitionTime":"2025-09-29T10:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.820575 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:44:53Z\\\",\\\"message\\\":\\\"mers/factory.go:160\\\\nI0929 10:44:53.327068 6174 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.326997 6174 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327207 6174 reflector.go:311] Stopping 
reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327278 6174 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327541 6174 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.328126 6174 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0929 10:44:53.328200 6174 factory.go:656] Stopping watch factory\\\\nI0929 10:44:53.328228 6174 handler.go:208] Removed *v1.Node event handler 2\\\\nI0929 10:44:53.331274 6174 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0929 10:44:53.331331 6174 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0929 10:44:53.331404 6174 ovnkube.go:599] Stopped ovnkube\\\\nI0929 10:44:53.331469 6174 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0929 10:44:53.331650 6174 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf48
69b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.834733 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.849262 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.864041 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.879494 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.886201 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs\") pod \"network-metrics-daemon-sq7f4\" (UID: \"0a33b92e-d79c-4162-8500-df7a89df8df3\") " pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.886255 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qck2m\" (UniqueName: \"kubernetes.io/projected/0a33b92e-d79c-4162-8500-df7a89df8df3-kube-api-access-qck2m\") pod \"network-metrics-daemon-sq7f4\" (UID: \"0a33b92e-d79c-4162-8500-df7a89df8df3\") " pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:44:55 crc kubenswrapper[4752]: E0929 10:44:55.886441 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 10:44:55 crc kubenswrapper[4752]: E0929 10:44:55.886550 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs podName:0a33b92e-d79c-4162-8500-df7a89df8df3 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:56.386525396 +0000 UTC m=+37.175667063 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs") pod "network-metrics-daemon-sq7f4" (UID: "0a33b92e-d79c-4162-8500-df7a89df8df3") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.892632 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5dba49df10714a5f00ec40865af87528f6bee63ee58a89f299af7c10e4d769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://073cf9e4675b04d77ad58f0b7e1b313e3fe15e8daee4e1c8934a90924b04ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.903724 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qck2m\" (UniqueName: \"kubernetes.io/projected/0a33b92e-d79c-4162-8500-df7a89df8df3-kube-api-access-qck2m\") pod \"network-metrics-daemon-sq7f4\" (UID: \"0a33b92e-d79c-4162-8500-df7a89df8df3\") " pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.904431 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sq7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a33b92e-d79c-4162-8500-df7a89df8df3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sq7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:55 crc 
kubenswrapper[4752]: I0929 10:44:55.906471 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.906504 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.906517 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.906535 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.906550 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:55Z","lastTransitionTime":"2025-09-29T10:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:55 crc kubenswrapper[4752]: I0929 10:44:55.920606 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:55Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.010287 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.010342 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.010355 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.010374 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.010387 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:56Z","lastTransitionTime":"2025-09-29T10:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.114629 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.114696 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.114708 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.114731 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.114744 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:56Z","lastTransitionTime":"2025-09-29T10:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.217837 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.217878 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.217891 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.217909 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.217920 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:56Z","lastTransitionTime":"2025-09-29T10:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.319783 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.319862 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.319875 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.319894 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.319907 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:56Z","lastTransitionTime":"2025-09-29T10:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.389965 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs\") pod \"network-metrics-daemon-sq7f4\" (UID: \"0a33b92e-d79c-4162-8500-df7a89df8df3\") " pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:44:56 crc kubenswrapper[4752]: E0929 10:44:56.390190 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 10:44:56 crc kubenswrapper[4752]: E0929 10:44:56.390348 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs podName:0a33b92e-d79c-4162-8500-df7a89df8df3 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:57.3903156 +0000 UTC m=+38.179457317 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs") pod "network-metrics-daemon-sq7f4" (UID: "0a33b92e-d79c-4162-8500-df7a89df8df3") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.422963 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.423037 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.423057 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.423087 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.423112 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:56Z","lastTransitionTime":"2025-09-29T10:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.526543 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.526607 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.526623 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.526649 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.526668 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:56Z","lastTransitionTime":"2025-09-29T10:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.630524 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.630729 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.630740 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.630764 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.630777 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:56Z","lastTransitionTime":"2025-09-29T10:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.693592 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:44:56 crc kubenswrapper[4752]: E0929 10:44:56.693925 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-29 10:45:12.69390506 +0000 UTC m=+53.483046727 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.733223 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.733276 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.733287 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.733308 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.733324 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:56Z","lastTransitionTime":"2025-09-29T10:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.794770 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.794875 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:44:56 crc kubenswrapper[4752]: E0929 10:44:56.794887 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.794925 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:56 crc kubenswrapper[4752]: E0929 10:44:56.794962 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 10:45:12.794942216 +0000 UTC m=+53.584083893 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.794985 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:44:56 crc kubenswrapper[4752]: E0929 10:44:56.795028 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 10:44:56 crc kubenswrapper[4752]: E0929 10:44:56.795068 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 10:45:12.795055419 +0000 UTC m=+53.584197086 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 10:44:56 crc kubenswrapper[4752]: E0929 10:44:56.795092 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 10:44:56 crc kubenswrapper[4752]: E0929 10:44:56.795112 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 10:44:56 crc kubenswrapper[4752]: E0929 10:44:56.795125 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:56 crc kubenswrapper[4752]: E0929 10:44:56.795164 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 10:45:12.795154122 +0000 UTC m=+53.584295789 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:56 crc kubenswrapper[4752]: E0929 10:44:56.795713 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 10:44:56 crc kubenswrapper[4752]: E0929 10:44:56.795766 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 10:44:56 crc kubenswrapper[4752]: E0929 10:44:56.795785 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:56 crc kubenswrapper[4752]: E0929 10:44:56.795930 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 10:45:12.795900091 +0000 UTC m=+53.585041799 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.836087 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.836144 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.836156 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.836188 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.836204 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:56Z","lastTransitionTime":"2025-09-29T10:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.939169 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.939221 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.939231 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.939248 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:56 crc kubenswrapper[4752]: I0929 10:44:56.939258 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:56Z","lastTransitionTime":"2025-09-29T10:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.026633 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.026729 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.026741 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.026764 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.026784 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:57Z","lastTransitionTime":"2025-09-29T10:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.030063 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.030082 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.030142 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.030214 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:44:57 crc kubenswrapper[4752]: E0929 10:44:57.030322 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:44:57 crc kubenswrapper[4752]: E0929 10:44:57.030489 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:44:57 crc kubenswrapper[4752]: E0929 10:44:57.030555 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:44:57 crc kubenswrapper[4752]: E0929 10:44:57.030628 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:44:57 crc kubenswrapper[4752]: E0929 10:44:57.042830 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.047430 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.047512 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.047527 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.047556 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.047570 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:57Z","lastTransitionTime":"2025-09-29T10:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:57 crc kubenswrapper[4752]: E0929 10:44:57.063210 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.066924 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.066988 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.066998 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.067019 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.067038 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:57Z","lastTransitionTime":"2025-09-29T10:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:57 crc kubenswrapper[4752]: E0929 10:44:57.082827 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.087483 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.087543 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.087560 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.087580 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.087593 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:57Z","lastTransitionTime":"2025-09-29T10:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:57 crc kubenswrapper[4752]: E0929 10:44:57.107853 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.113386 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.113440 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.113453 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.113475 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.113488 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:57Z","lastTransitionTime":"2025-09-29T10:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:57 crc kubenswrapper[4752]: E0929 10:44:57.127932 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:57 crc kubenswrapper[4752]: E0929 10:44:57.128125 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.130369 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.130404 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.130415 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.130435 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.130448 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:57Z","lastTransitionTime":"2025-09-29T10:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.233289 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.233335 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.233346 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.233361 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.233372 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:57Z","lastTransitionTime":"2025-09-29T10:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.336581 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.336683 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.336712 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.336749 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.336769 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:57Z","lastTransitionTime":"2025-09-29T10:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.401642 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs\") pod \"network-metrics-daemon-sq7f4\" (UID: \"0a33b92e-d79c-4162-8500-df7a89df8df3\") " pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:44:57 crc kubenswrapper[4752]: E0929 10:44:57.401898 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 10:44:57 crc kubenswrapper[4752]: E0929 10:44:57.401981 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs podName:0a33b92e-d79c-4162-8500-df7a89df8df3 nodeName:}" failed. No retries permitted until 2025-09-29 10:44:59.401959672 +0000 UTC m=+40.191101329 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs") pod "network-metrics-daemon-sq7f4" (UID: "0a33b92e-d79c-4162-8500-df7a89df8df3") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.440063 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.440131 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.440140 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.440159 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.440169 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:57Z","lastTransitionTime":"2025-09-29T10:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.543907 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.543971 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.543989 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.544010 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.544023 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:57Z","lastTransitionTime":"2025-09-29T10:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.647098 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.647155 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.647170 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.647206 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.647232 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:57Z","lastTransitionTime":"2025-09-29T10:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.749989 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.750067 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.750081 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.750102 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.750116 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:57Z","lastTransitionTime":"2025-09-29T10:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.852340 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.852406 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.852422 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.852442 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.852455 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:57Z","lastTransitionTime":"2025-09-29T10:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.955036 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.955076 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.955085 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.955100 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:57 crc kubenswrapper[4752]: I0929 10:44:57.955110 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:57Z","lastTransitionTime":"2025-09-29T10:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.032190 4752 scope.go:117] "RemoveContainer" containerID="c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.059167 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.059231 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.059243 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.059268 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.059286 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:58Z","lastTransitionTime":"2025-09-29T10:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.162439 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.162487 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.162497 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.162514 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.162530 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:58Z","lastTransitionTime":"2025-09-29T10:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.266555 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.266604 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.266614 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.266632 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.266644 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:58Z","lastTransitionTime":"2025-09-29T10:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.344328 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.347365 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"362298e6215cc1a9971973419e58a45e5ded2c4120b1e800afd87f480f6fd3d6"} Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.347870 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.362921 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff41739
7c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd36
7c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.369227 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.369267 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.369279 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.369298 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 
10:44:58.369310 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:58Z","lastTransitionTime":"2025-09-29T10:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.383456 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:44:53Z\\\",\\\"message\\\":\\\"mers/factory.go:160\\\\nI0929 10:44:53.327068 6174 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.326997 6174 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327207 6174 reflector.go:311] Stopping 
reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327278 6174 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327541 6174 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.328126 6174 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0929 10:44:53.328200 6174 factory.go:656] Stopping watch factory\\\\nI0929 10:44:53.328228 6174 handler.go:208] Removed *v1.Node event handler 2\\\\nI0929 10:44:53.331274 6174 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0929 10:44:53.331331 6174 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0929 10:44:53.331404 6174 ovnkube.go:599] Stopped ovnkube\\\\nI0929 10:44:53.331469 6174 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0929 10:44:53.331650 6174 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf48
69b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.400528 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://362298e6215cc1a9971973419e58a45e5ded2c4120b1e800afd87f480f6fd3d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.413402 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 
10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.423188 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.433225 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.444832 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.456245 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5dba49df10714a5f00ec40865af87528f6bee63ee58a89f299af7c10e4d769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://073cf9e46
75b04d77ad58f0b7e1b313e3fe15e8daee4e1c8934a90924b04ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.467366 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sq7f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a33b92e-d79c-4162-8500-df7a89df8df3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sq7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:58 crc 
kubenswrapper[4752]: I0929 10:44:58.471531 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.471580 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.471592 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.471611 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.471623 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:58Z","lastTransitionTime":"2025-09-29T10:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.481996 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a
827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.495357 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.507582 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.522431 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232
ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.539007 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.564387 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10
:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.574030 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.574071 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.574080 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.574096 4752 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.574106 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:58Z","lastTransitionTime":"2025-09-29T10:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.581232 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.596734 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.677432 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.677476 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.677487 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.677506 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.677520 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:58Z","lastTransitionTime":"2025-09-29T10:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.780771 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.780824 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.780835 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.780853 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.780863 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:58Z","lastTransitionTime":"2025-09-29T10:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.883705 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.883768 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.883786 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.883836 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.883852 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:58Z","lastTransitionTime":"2025-09-29T10:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.987086 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.987134 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.987144 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.987163 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:58 crc kubenswrapper[4752]: I0929 10:44:58.987175 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:58Z","lastTransitionTime":"2025-09-29T10:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.031102 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.031102 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.031118 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.031135 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:44:59 crc kubenswrapper[4752]: E0929 10:44:59.031284 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:44:59 crc kubenswrapper[4752]: E0929 10:44:59.031335 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:44:59 crc kubenswrapper[4752]: E0929 10:44:59.031419 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:44:59 crc kubenswrapper[4752]: E0929 10:44:59.031552 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.092350 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.092412 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.092423 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.092448 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.092464 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:59Z","lastTransitionTime":"2025-09-29T10:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.194889 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.194924 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.194958 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.194974 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.194985 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:59Z","lastTransitionTime":"2025-09-29T10:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.297598 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.297655 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.297667 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.297685 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.297700 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:59Z","lastTransitionTime":"2025-09-29T10:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.400554 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.400610 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.400623 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.400643 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.400657 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:59Z","lastTransitionTime":"2025-09-29T10:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.426276 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs\") pod \"network-metrics-daemon-sq7f4\" (UID: \"0a33b92e-d79c-4162-8500-df7a89df8df3\") " pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:44:59 crc kubenswrapper[4752]: E0929 10:44:59.426449 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 10:44:59 crc kubenswrapper[4752]: E0929 10:44:59.426535 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs podName:0a33b92e-d79c-4162-8500-df7a89df8df3 nodeName:}" failed. No retries permitted until 2025-09-29 10:45:03.4265159 +0000 UTC m=+44.215657577 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs") pod "network-metrics-daemon-sq7f4" (UID: "0a33b92e-d79c-4162-8500-df7a89df8df3") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.503873 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.503936 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.503955 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.503980 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.503999 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:59Z","lastTransitionTime":"2025-09-29T10:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.606624 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.606675 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.606686 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.606704 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.606717 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:59Z","lastTransitionTime":"2025-09-29T10:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.709593 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.709678 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.709706 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.709742 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.709766 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:59Z","lastTransitionTime":"2025-09-29T10:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.813729 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.813849 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.813875 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.813915 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.813938 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:59Z","lastTransitionTime":"2025-09-29T10:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.916923 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.916991 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.917012 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.917036 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:44:59 crc kubenswrapper[4752]: I0929 10:44:59.917049 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:44:59Z","lastTransitionTime":"2025-09-29T10:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.019618 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.019664 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.019676 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.019692 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.019703 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:00Z","lastTransitionTime":"2025-09-29T10:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.048153 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.063778 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.088460 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10
:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.107970 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.122021 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.122057 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.122066 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.122083 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.122095 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:00Z","lastTransitionTime":"2025-09-29T10:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.124112 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.151052 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-
29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.170907 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:44:53Z\\\",\\\"message\\\":\\\"mers/factory.go:160\\\\nI0929 10:44:53.327068 6174 reflector.go:311] Stopping reflector 
*v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.326997 6174 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327207 6174 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327278 6174 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327541 6174 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.328126 6174 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0929 10:44:53.328200 6174 factory.go:656] Stopping watch factory\\\\nI0929 10:44:53.328228 6174 handler.go:208] Removed *v1.Node event handler 2\\\\nI0929 10:44:53.331274 6174 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0929 10:44:53.331331 6174 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0929 10:44:53.331404 6174 ovnkube.go:599] Stopped ovnkube\\\\nI0929 10:44:53.331469 6174 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0929 10:44:53.331650 6174 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf48
69b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.187752 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://362298e6215cc1a9971973419e58a45e5ded2c4120b1e800afd87f480f6fd3d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.209021 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 
10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.224393 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.225229 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.225272 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.225283 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.225302 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.225314 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:00Z","lastTransitionTime":"2025-09-29T10:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.240698 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.255898 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10
:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.270341 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5dba49df10714a5f00ec40865af87528f6bee63ee58a89f299af7c10e4d769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://073cf9e4675b04d77ad58f0b7e1b313e3fe15e8daee4e1c8934a90924b04ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.283623 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sq7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a33b92e-d79c-4162-8500-df7a89df8df3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sq7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:00 crc 
kubenswrapper[4752]: I0929 10:45:00.297734 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.312203 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.327787 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 
10:45:00.327851 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.327864 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.327885 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.327900 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:00Z","lastTransitionTime":"2025-09-29T10:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.328234 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.430919 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.431001 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.431024 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:00 crc 
kubenswrapper[4752]: I0929 10:45:00.431066 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.431102 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:00Z","lastTransitionTime":"2025-09-29T10:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.535272 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.535319 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.535328 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.535344 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.535358 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:00Z","lastTransitionTime":"2025-09-29T10:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.638159 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.638201 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.638213 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.638231 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.638244 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:00Z","lastTransitionTime":"2025-09-29T10:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.741632 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.741700 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.741714 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.741736 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.741751 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:00Z","lastTransitionTime":"2025-09-29T10:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.844398 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.844448 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.844460 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.844480 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.844493 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:00Z","lastTransitionTime":"2025-09-29T10:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.947656 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.947705 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.947717 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.947736 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:00 crc kubenswrapper[4752]: I0929 10:45:00.947752 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:00Z","lastTransitionTime":"2025-09-29T10:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.030019 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.030085 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:01 crc kubenswrapper[4752]: E0929 10:45:01.030180 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.030199 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.030251 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:01 crc kubenswrapper[4752]: E0929 10:45:01.030359 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:01 crc kubenswrapper[4752]: E0929 10:45:01.030459 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:01 crc kubenswrapper[4752]: E0929 10:45:01.030511 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.050131 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.050185 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.050200 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.050224 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.050237 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:01Z","lastTransitionTime":"2025-09-29T10:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.153662 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.153738 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.153751 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.153773 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.153789 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:01Z","lastTransitionTime":"2025-09-29T10:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.257302 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.257364 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.257405 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.257433 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.257448 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:01Z","lastTransitionTime":"2025-09-29T10:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.359683 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.360102 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.360212 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.360306 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.360389 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:01Z","lastTransitionTime":"2025-09-29T10:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.463328 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.463400 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.463421 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.463449 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.463465 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:01Z","lastTransitionTime":"2025-09-29T10:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.566651 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.566945 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.566956 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.566975 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.566985 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:01Z","lastTransitionTime":"2025-09-29T10:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.670045 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.670120 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.670132 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.670151 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.670163 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:01Z","lastTransitionTime":"2025-09-29T10:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.774174 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.774229 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.774239 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.774259 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.774276 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:01Z","lastTransitionTime":"2025-09-29T10:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.877101 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.877155 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.877169 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.877189 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.877201 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:01Z","lastTransitionTime":"2025-09-29T10:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.980132 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.980177 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.980189 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.980209 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:01 crc kubenswrapper[4752]: I0929 10:45:01.980223 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:01Z","lastTransitionTime":"2025-09-29T10:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.082683 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.082725 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.082732 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.082749 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.082759 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:02Z","lastTransitionTime":"2025-09-29T10:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.185252 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.185299 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.185310 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.185330 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.185341 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:02Z","lastTransitionTime":"2025-09-29T10:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.288419 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.288466 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.288476 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.288495 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.288507 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:02Z","lastTransitionTime":"2025-09-29T10:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.390761 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.390828 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.390842 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.390858 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.390867 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:02Z","lastTransitionTime":"2025-09-29T10:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.493713 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.493757 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.493768 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.493783 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.493794 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:02Z","lastTransitionTime":"2025-09-29T10:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.596905 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.596967 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.596977 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.596995 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.597009 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:02Z","lastTransitionTime":"2025-09-29T10:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.700033 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.700109 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.700144 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.700165 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.700179 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:02Z","lastTransitionTime":"2025-09-29T10:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.803956 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.804031 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.804043 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.804062 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.804073 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:02Z","lastTransitionTime":"2025-09-29T10:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.906592 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.906671 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.906685 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.906706 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:02 crc kubenswrapper[4752]: I0929 10:45:02.906720 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:02Z","lastTransitionTime":"2025-09-29T10:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.009592 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.009658 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.009671 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.009693 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.009705 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:03Z","lastTransitionTime":"2025-09-29T10:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.031030 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.031060 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.031101 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:03 crc kubenswrapper[4752]: E0929 10:45:03.031296 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:03 crc kubenswrapper[4752]: E0929 10:45:03.031525 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.031713 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:03 crc kubenswrapper[4752]: E0929 10:45:03.031739 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:03 crc kubenswrapper[4752]: E0929 10:45:03.032196 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.113282 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.113361 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.113382 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.113409 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.113430 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:03Z","lastTransitionTime":"2025-09-29T10:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.216426 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.216501 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.216523 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.216555 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.216574 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:03Z","lastTransitionTime":"2025-09-29T10:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.319497 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.319572 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.319587 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.319612 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.319639 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:03Z","lastTransitionTime":"2025-09-29T10:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.422567 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.422615 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.422626 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.422643 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.422654 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:03Z","lastTransitionTime":"2025-09-29T10:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.471063 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs\") pod \"network-metrics-daemon-sq7f4\" (UID: \"0a33b92e-d79c-4162-8500-df7a89df8df3\") " pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:03 crc kubenswrapper[4752]: E0929 10:45:03.471276 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 10:45:03 crc kubenswrapper[4752]: E0929 10:45:03.471349 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs podName:0a33b92e-d79c-4162-8500-df7a89df8df3 nodeName:}" failed. No retries permitted until 2025-09-29 10:45:11.471328503 +0000 UTC m=+52.260470170 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs") pod "network-metrics-daemon-sq7f4" (UID: "0a33b92e-d79c-4162-8500-df7a89df8df3") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.526185 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.526868 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.526920 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.526947 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.526962 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:03Z","lastTransitionTime":"2025-09-29T10:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.630624 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.630662 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.630672 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.630693 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.630710 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:03Z","lastTransitionTime":"2025-09-29T10:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.734281 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.734341 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.734359 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.734386 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.734404 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:03Z","lastTransitionTime":"2025-09-29T10:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.844196 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.844234 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.844242 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.844258 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.844267 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:03Z","lastTransitionTime":"2025-09-29T10:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.947333 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.947377 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.947390 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.947404 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:03 crc kubenswrapper[4752]: I0929 10:45:03.947413 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:03Z","lastTransitionTime":"2025-09-29T10:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.050273 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.050332 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.050346 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.050369 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.050385 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:04Z","lastTransitionTime":"2025-09-29T10:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.153351 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.153411 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.153421 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.153439 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.153451 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:04Z","lastTransitionTime":"2025-09-29T10:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.256762 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.256854 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.256873 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.256895 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.256908 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:04Z","lastTransitionTime":"2025-09-29T10:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.359931 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.359990 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.360004 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.360023 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.360036 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:04Z","lastTransitionTime":"2025-09-29T10:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.462251 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.462324 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.462343 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.462374 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.462393 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:04Z","lastTransitionTime":"2025-09-29T10:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.566048 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.566111 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.566123 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.566153 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.566166 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:04Z","lastTransitionTime":"2025-09-29T10:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.668917 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.668951 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.668961 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.668975 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.668986 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:04Z","lastTransitionTime":"2025-09-29T10:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.771985 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.772053 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.772070 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.772099 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.772117 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:04Z","lastTransitionTime":"2025-09-29T10:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.875594 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.875647 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.875660 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.875683 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.875700 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:04Z","lastTransitionTime":"2025-09-29T10:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.979182 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.979270 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.979290 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.979327 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:04 crc kubenswrapper[4752]: I0929 10:45:04.979350 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:04Z","lastTransitionTime":"2025-09-29T10:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.030446 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.030501 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.030513 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:05 crc kubenswrapper[4752]: E0929 10:45:05.030639 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.030703 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:05 crc kubenswrapper[4752]: E0929 10:45:05.030752 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:05 crc kubenswrapper[4752]: E0929 10:45:05.030956 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:05 crc kubenswrapper[4752]: E0929 10:45:05.031135 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.081911 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.081960 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.081973 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.081992 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.082004 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:05Z","lastTransitionTime":"2025-09-29T10:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.184370 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.184433 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.184448 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.184468 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.184481 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:05Z","lastTransitionTime":"2025-09-29T10:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.286796 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.286862 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.286875 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.286893 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.286903 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:05Z","lastTransitionTime":"2025-09-29T10:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.389966 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.390038 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.390056 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.390084 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.390104 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:05Z","lastTransitionTime":"2025-09-29T10:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.493586 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.493642 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.493662 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.493688 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.493709 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:05Z","lastTransitionTime":"2025-09-29T10:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.596723 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.596775 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.596788 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.596824 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.596837 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:05Z","lastTransitionTime":"2025-09-29T10:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.699485 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.699552 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.699567 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.699588 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.699601 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:05Z","lastTransitionTime":"2025-09-29T10:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.802833 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.802910 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.802940 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.802979 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.803004 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:05Z","lastTransitionTime":"2025-09-29T10:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.906104 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.906171 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.906181 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.906199 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:05 crc kubenswrapper[4752]: I0929 10:45:05.906210 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:05Z","lastTransitionTime":"2025-09-29T10:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.009019 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.009079 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.009093 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.009140 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.009156 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:06Z","lastTransitionTime":"2025-09-29T10:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.112282 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.112333 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.112344 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.112363 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.112376 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:06Z","lastTransitionTime":"2025-09-29T10:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.215320 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.215366 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.215375 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.215390 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.215400 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:06Z","lastTransitionTime":"2025-09-29T10:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.318624 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.318678 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.318689 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.318710 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.318723 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:06Z","lastTransitionTime":"2025-09-29T10:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.421602 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.421651 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.421664 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.421686 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.421697 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:06Z","lastTransitionTime":"2025-09-29T10:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.524425 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.524491 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.524504 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.524525 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.524545 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:06Z","lastTransitionTime":"2025-09-29T10:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.627558 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.627613 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.627626 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.627644 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.627659 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:06Z","lastTransitionTime":"2025-09-29T10:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.731131 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.731175 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.731185 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.731201 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.731213 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:06Z","lastTransitionTime":"2025-09-29T10:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.834495 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.834572 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.834585 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.834608 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.834622 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:06Z","lastTransitionTime":"2025-09-29T10:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.938276 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.938324 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.938335 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.938354 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:06 crc kubenswrapper[4752]: I0929 10:45:06.938365 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:06Z","lastTransitionTime":"2025-09-29T10:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.030498 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.030498 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:07 crc kubenswrapper[4752]: E0929 10:45:07.030665 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.030766 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:07 crc kubenswrapper[4752]: E0929 10:45:07.030905 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.030522 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:07 crc kubenswrapper[4752]: E0929 10:45:07.030980 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:07 crc kubenswrapper[4752]: E0929 10:45:07.031228 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.041030 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.041092 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.041103 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.041125 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.041172 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:07Z","lastTransitionTime":"2025-09-29T10:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.144477 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.144543 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.144555 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.144595 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.144610 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:07Z","lastTransitionTime":"2025-09-29T10:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.220075 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.220148 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.220167 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.220192 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.220210 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:07Z","lastTransitionTime":"2025-09-29T10:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:07 crc kubenswrapper[4752]: E0929 10:45:07.240969 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.247060 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.247120 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.247140 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.247167 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.247186 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:07Z","lastTransitionTime":"2025-09-29T10:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:07 crc kubenswrapper[4752]: E0929 10:45:07.272190 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.277400 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.277464 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.277477 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.277531 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.277548 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:07Z","lastTransitionTime":"2025-09-29T10:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:07 crc kubenswrapper[4752]: E0929 10:45:07.292391 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.298659 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.298741 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.298759 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.298787 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.298843 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:07Z","lastTransitionTime":"2025-09-29T10:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:07 crc kubenswrapper[4752]: E0929 10:45:07.320257 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.325065 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.325112 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.325124 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.325141 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.325151 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:07Z","lastTransitionTime":"2025-09-29T10:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:07 crc kubenswrapper[4752]: E0929 10:45:07.342131 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:07 crc kubenswrapper[4752]: E0929 10:45:07.342272 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.344381 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.344419 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.344430 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.344452 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.344471 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:07Z","lastTransitionTime":"2025-09-29T10:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.446526 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.446571 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.446631 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.446648 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.446659 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:07Z","lastTransitionTime":"2025-09-29T10:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.549286 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.549331 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.549340 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.549373 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.549387 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:07Z","lastTransitionTime":"2025-09-29T10:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.651791 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.651857 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.651867 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.651887 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.651897 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:07Z","lastTransitionTime":"2025-09-29T10:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.755179 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.755229 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.755239 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.755255 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.755267 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:07Z","lastTransitionTime":"2025-09-29T10:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.858480 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.858529 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.858543 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.858565 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.858580 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:07Z","lastTransitionTime":"2025-09-29T10:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.955988 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.957160 4752 scope.go:117] "RemoveContainer" containerID="da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.966218 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.966288 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.966301 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.966322 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:07 crc kubenswrapper[4752]: I0929 10:45:07.966332 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:07Z","lastTransitionTime":"2025-09-29T10:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.069917 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.069961 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.069971 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.069988 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.069999 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:08Z","lastTransitionTime":"2025-09-29T10:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.172355 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.172851 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.172863 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.172883 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.172895 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:08Z","lastTransitionTime":"2025-09-29T10:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.275699 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.275749 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.275761 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.275780 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.275796 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:08Z","lastTransitionTime":"2025-09-29T10:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.379427 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.379476 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.379488 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.379511 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.379529 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:08Z","lastTransitionTime":"2025-09-29T10:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.389722 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovnkube-controller/1.log" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.393664 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerStarted","Data":"d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76"} Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.394329 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.412126 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://362298e6215cc1a9971973419e58a45e5ded2c4120b1e800afd87f480f6fd3d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.435654 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 
10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.448741 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.465419 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceac
count\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://
6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.482778 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.482845 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.482858 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.482879 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.482893 4752 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:08Z","lastTransitionTime":"2025-09-29T10:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.490162 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:44:53Z\\\",\\\"message\\\":\\\"mers/factory.go:160\\\\nI0929 10:44:53.327068 6174 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.326997 6174 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327207 6174 reflector.go:311] Stopping 
reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327278 6174 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327541 6174 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.328126 6174 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0929 10:44:53.328200 6174 factory.go:656] Stopping watch factory\\\\nI0929 10:44:53.328228 6174 handler.go:208] Removed *v1.Node event handler 2\\\\nI0929 10:44:53.331274 6174 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0929 10:44:53.331331 6174 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0929 10:44:53.331404 6174 ovnkube.go:599] Stopped ovnkube\\\\nI0929 10:44:53.331469 6174 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0929 10:44:53.331650 6174 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.509534 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.538903 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.554232 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.568550 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.579758 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5dba49df10714a5f00ec40865af87528f6bee63ee58a89f299af7c10e4d769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metr
ics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://073cf9e4675b04d77ad58f0b7e1b313e3fe15e8daee4e1c8934a90924b04ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.586375 4752 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.586426 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.586436 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.586455 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.586469 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:08Z","lastTransitionTime":"2025-09-29T10:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.611177 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sq7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a33b92e-d79c-4162-8500-df7a89df8df3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sq7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc 
kubenswrapper[4752]: I0929 10:45:08.639183 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.668531 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.673443 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.678108 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.688616 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.688665 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.688676 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.688696 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.688710 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:08Z","lastTransitionTime":"2025-09-29T10:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.690569 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.704726 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.720776 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.735939 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.754354 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688
f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed
9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
5-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.775567 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:44:53Z\\\",\\\"message\\\":\\\"mers/factory.go:160\\\\nI0929 10:44:53.327068 6174 reflector.go:311] 
Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.326997 6174 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327207 6174 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327278 6174 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327541 6174 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.328126 6174 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0929 10:44:53.328200 6174 factory.go:656] Stopping watch factory\\\\nI0929 10:44:53.328228 6174 handler.go:208] Removed *v1.Node event handler 2\\\\nI0929 10:44:53.331274 6174 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0929 10:44:53.331331 6174 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0929 10:44:53.331404 6174 ovnkube.go:599] Stopped ovnkube\\\\nI0929 10:44:53.331469 6174 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0929 10:44:53.331650 6174 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.791069 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.791122 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.791136 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.791156 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.791168 4752 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:08Z","lastTransitionTime":"2025-09-29T10:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.791087 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://362298e6215cc1a9971973419e58a45e5ded2c4120b1e800afd87f480f6fd3d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.807736 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 
10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.823221 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.839856 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.854949 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.868592 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5dba49df10714a5f00ec40865af87528f6bee63ee58a89f299af7c10e4d769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://073cf9e46
75b04d77ad58f0b7e1b313e3fe15e8daee4e1c8934a90924b04ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.882078 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sq7f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a33b92e-d79c-4162-8500-df7a89df8df3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sq7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc 
kubenswrapper[4752]: I0929 10:45:08.893547 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.893599 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.893610 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.893630 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.893643 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:08Z","lastTransitionTime":"2025-09-29T10:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.897563 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a
827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.913927 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.928424 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22c62d6c-d29c-416f-bfeb-476f97181a39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bab580564f9dd31f6b2ea23a31918a9fdd2f247d13a0bd882f38dbaee4bf0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5132d9611cf58eef747d86fd0cef4eb52366b9d1bacc6df0cf5be145d3998\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6d3ad808fe69e726b66a03be183d33f000a614fadbc7f644015633fbb2b457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.942754 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.955631 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232
ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.968866 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.991353 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10
:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.996282 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.996316 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.996325 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.996341 4752 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Sep 29 10:45:08 crc kubenswrapper[4752]: I0929 10:45:08.996351 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:08Z","lastTransitionTime":"2025-09-29T10:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.006583 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.020369 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.030580 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.030642 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:09 crc kubenswrapper[4752]: E0929 10:45:09.030745 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.030767 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:09 crc kubenswrapper[4752]: E0929 10:45:09.030819 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.030580 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:09 crc kubenswrapper[4752]: E0929 10:45:09.030863 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:09 crc kubenswrapper[4752]: E0929 10:45:09.030933 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.103202 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.103266 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.103281 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.103301 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.103317 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:09Z","lastTransitionTime":"2025-09-29T10:45:09Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.206941 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.206991 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.207003 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.207023 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.207037 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:09Z","lastTransitionTime":"2025-09-29T10:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.310024 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.310077 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.310091 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.310130 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.310141 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:09Z","lastTransitionTime":"2025-09-29T10:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.400369 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovnkube-controller/2.log" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.401524 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovnkube-controller/1.log" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.405792 4752 generic.go:334] "Generic (PLEG): container finished" podID="94028c24-ec10-4d5c-b32c-1700e677d539" containerID="d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76" exitCode=1 Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.407188 4752 scope.go:117] "RemoveContainer" containerID="d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.407192 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerDied","Data":"d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76"} Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.407320 4752 scope.go:117] "RemoveContainer" containerID="da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed" Sep 29 10:45:09 crc kubenswrapper[4752]: E0929 10:45:09.407353 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.413253 4752 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.413294 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.413304 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.413324 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.413337 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:09Z","lastTransitionTime":"2025-09-29T10:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.438730 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.457501 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.474126 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.491052 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.506492 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.516311 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.516350 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.516362 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.516382 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.516396 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:09Z","lastTransitionTime":"2025-09-29T10:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.526123 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://362298e6215cc1a9971973419e58a45e5ded2c4120b1e800afd87f480f6fd3d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.544505 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 
10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.558321 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.576540 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceac
count\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://
6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.606034 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:44:53Z\\\",\\\"message\\\":\\\"mers/factory.go:160\\\\nI0929 10:44:53.327068 6174 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.326997 6174 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327207 6174 reflector.go:311] Stopping 
reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327278 6174 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327541 6174 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.328126 6174 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0929 10:44:53.328200 6174 factory.go:656] Stopping watch factory\\\\nI0929 10:44:53.328228 6174 handler.go:208] Removed *v1.Node event handler 2\\\\nI0929 10:44:53.331274 6174 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0929 10:44:53.331331 6174 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0929 10:44:53.331404 6174 ovnkube.go:599] Stopped ovnkube\\\\nI0929 10:44:53.331469 6174 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0929 10:44:53.331650 6174 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:09Z\\\",\\\"message\\\":\\\"update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/marketplace-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column 
_uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 10:45:08.961904 6414 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0929 10:45:08.961922 6414 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0929 10:45:08.961943 6414 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0929 10:45:08.961950 6414 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",
\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.619761 4752 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.619831 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.619842 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.619859 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.619873 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:09Z","lastTransitionTime":"2025-09-29T10:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.623218 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a
827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.639433 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.653780 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.670080 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-09-29T10:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.683725 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5dba49df10714a5f00ec40865af87528f6bee63ee58a89f299af7c10e4d769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metr
ics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://073cf9e4675b04d77ad58f0b7e1b313e3fe15e8daee4e1c8934a90924b04ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.695413 4752 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-multus/network-metrics-daemon-sq7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a33b92e-d79c-4162-8500-df7a89df8df3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sq7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:09 crc 
kubenswrapper[4752]: I0929 10:45:09.708414 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22c62d6c-d29c-416f-bfeb-476f97181a39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bab580564f9dd31f6b2ea23a31918a9fdd2f247d13a0bd882f38dbaee4bf0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5132d9611cf58eef747d86fd0cef4eb52366b9d1bacc6df0cf5be145d3998\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6d3ad808fe69e726b66a03be183d33f000a614fadbc7f644015633fbb2b457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.721862 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.722560 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.722626 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.722640 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:09 crc 
kubenswrapper[4752]: I0929 10:45:09.722661 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.722677 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:09Z","lastTransitionTime":"2025-09-29T10:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.825849 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.825892 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.825901 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.825917 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.825929 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:09Z","lastTransitionTime":"2025-09-29T10:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.929097 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.929140 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.929154 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.929174 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:09 crc kubenswrapper[4752]: I0929 10:45:09.929188 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:09Z","lastTransitionTime":"2025-09-29T10:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.032466 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.033162 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.033204 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.033231 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.033245 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:10Z","lastTransitionTime":"2025-09-29T10:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.046384 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.070147 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.102053 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da3dc227c40a352ef71dec7f4fe6a59b773b7901f2ec3ec4f18c829adf8e87ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:44:53Z\\\",\\\"message\\\":\\\"mers/factory.go:160\\\\nI0929 10:44:53.327068 6174 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.326997 6174 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327207 6174 reflector.go:311] Stopping 
reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327278 6174 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.327541 6174 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 10:44:53.328126 6174 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0929 10:44:53.328200 6174 factory.go:656] Stopping watch factory\\\\nI0929 10:44:53.328228 6174 handler.go:208] Removed *v1.Node event handler 2\\\\nI0929 10:44:53.331274 6174 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0929 10:44:53.331331 6174 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0929 10:44:53.331404 6174 ovnkube.go:599] Stopped ovnkube\\\\nI0929 10:44:53.331469 6174 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0929 10:44:53.331650 6174 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:09Z\\\",\\\"message\\\":\\\"update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/marketplace-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column 
_uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 10:45:08.961904 6414 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0929 10:45:08.961922 6414 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0929 10:45:08.961943 6414 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0929 10:45:08.961950 6414 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",
\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.123397 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://362298e6215cc1a9971973419e58a45e5ded2c4120b1e800afd87f480f6fd3d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.136173 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.136241 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.136251 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.136290 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.136303 4752 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:10Z","lastTransitionTime":"2025-09-29T10:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.147187 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.164607 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.181579 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.198003 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.215222 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5dba49df10714a5f00ec40865af87528f6bee63ee58a89f299af7c10e4d769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metr
ics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://073cf9e4675b04d77ad58f0b7e1b313e3fe15e8daee4e1c8934a90924b04ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.230684 4752 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-multus/network-metrics-daemon-sq7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a33b92e-d79c-4162-8500-df7a89df8df3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sq7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc 
kubenswrapper[4752]: I0929 10:45:10.239400 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.239458 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.239469 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.239492 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.239511 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:10Z","lastTransitionTime":"2025-09-29T10:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.251062 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a
827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.266312 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22c62d6c-d29c-416f-bfeb-476f97181a39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bab580564f9dd31f6b2ea23a31918a9fdd2f247d13a0bd882f38dbaee4bf0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5132d9611cf58eef747d86fd0cef4eb52366b9d1bacc6df0cf5be145d3998\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6d3ad808fe69e726b66a03be183d33f000a614fadbc7f644015633fbb2b457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.282605 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.300625 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.314824 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.328543 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.342836 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.342904 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.342919 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.342943 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.342958 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:10Z","lastTransitionTime":"2025-09-29T10:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.349209 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.366076 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.411879 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovnkube-controller/2.log" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.416492 4752 scope.go:117] "RemoveContainer" containerID="d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76" Sep 29 10:45:10 crc kubenswrapper[4752]: E0929 10:45:10.416700 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.439600 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:09Z\\\",\\\"message\\\":\\\"update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/marketplace-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true 
skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 10:45:08.961904 6414 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0929 10:45:08.961922 6414 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0929 10:45:08.961943 6414 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0929 10:45:08.961950 6414 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:45:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf48
69b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.445054 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.445101 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.445115 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.445133 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.445147 4752 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:10Z","lastTransitionTime":"2025-09-29T10:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.458178 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://362298e6215cc1a9971973419e58a45e5ded2c4120b1e800afd87f480f6fd3d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.472902 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 
10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.485513 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.502423 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceac
count\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://
6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.517006 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.530126 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5dba49df10714a5f00ec40865af87528f6bee63ee58a89f299af7c10e4d769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://073cf9e46
75b04d77ad58f0b7e1b313e3fe15e8daee4e1c8934a90924b04ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.543573 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sq7f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a33b92e-d79c-4162-8500-df7a89df8df3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sq7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc 
kubenswrapper[4752]: I0929 10:45:10.547578 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.547622 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.547631 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.547646 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.547659 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:10Z","lastTransitionTime":"2025-09-29T10:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.559958 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a
827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.573972 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.590622 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.604591 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22c62d6c-d29c-416f-bfeb-476f97181a39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bab580564f9dd31f6b2ea23a31918a9fdd2f247d13a0bd882f38dbaee4bf0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5132d9611cf58eef747d86fd0cef4eb52366b9d1bacc6df0cf5be145d3998\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6d3ad808fe69e726b66a03be183d33f000a614fadbc7f644015633fbb2b457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.618941 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.633942 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.651373 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.651438 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.651454 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.651481 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.651507 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:10Z","lastTransitionTime":"2025-09-29T10:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.659842 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.673433 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.686656 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.698868 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.754850 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 
10:45:10.754901 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.754915 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.754933 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.754945 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:10Z","lastTransitionTime":"2025-09-29T10:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.857558 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.857618 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.857633 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.857653 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.857668 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:10Z","lastTransitionTime":"2025-09-29T10:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.960392 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.960448 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.960466 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.960490 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:10 crc kubenswrapper[4752]: I0929 10:45:10.960509 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:10Z","lastTransitionTime":"2025-09-29T10:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.031140 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.031190 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.031190 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:11 crc kubenswrapper[4752]: E0929 10:45:11.031887 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:11 crc kubenswrapper[4752]: E0929 10:45:11.031645 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.031235 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:11 crc kubenswrapper[4752]: E0929 10:45:11.031978 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:11 crc kubenswrapper[4752]: E0929 10:45:11.032032 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.064719 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.064789 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.064819 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.064840 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.064852 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:11Z","lastTransitionTime":"2025-09-29T10:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.168081 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.168131 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.168143 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.168164 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.168176 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:11Z","lastTransitionTime":"2025-09-29T10:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.271411 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.271455 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.271463 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.271483 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.271495 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:11Z","lastTransitionTime":"2025-09-29T10:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.374382 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.374422 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.374432 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.374446 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.374458 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:11Z","lastTransitionTime":"2025-09-29T10:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.478128 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.478194 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.478206 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.478227 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.478242 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:11Z","lastTransitionTime":"2025-09-29T10:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.562160 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs\") pod \"network-metrics-daemon-sq7f4\" (UID: \"0a33b92e-d79c-4162-8500-df7a89df8df3\") " pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:11 crc kubenswrapper[4752]: E0929 10:45:11.562405 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 10:45:11 crc kubenswrapper[4752]: E0929 10:45:11.562534 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs podName:0a33b92e-d79c-4162-8500-df7a89df8df3 nodeName:}" failed. No retries permitted until 2025-09-29 10:45:27.562508158 +0000 UTC m=+68.351650015 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs") pod "network-metrics-daemon-sq7f4" (UID: "0a33b92e-d79c-4162-8500-df7a89df8df3") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.581746 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.581860 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.581878 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.581903 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.581917 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:11Z","lastTransitionTime":"2025-09-29T10:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.684777 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.684868 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.684889 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.684922 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.684945 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:11Z","lastTransitionTime":"2025-09-29T10:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.788388 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.788469 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.788497 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.788527 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.788553 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:11Z","lastTransitionTime":"2025-09-29T10:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.891927 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.891997 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.892013 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.892033 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.892047 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:11Z","lastTransitionTime":"2025-09-29T10:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.932907 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.955403 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://362298e6215cc1a9971973419e58a45e5ded2c4120b1e800afd87f480f6fd3d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc2
76e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.976225 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.991263 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.995174 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.995213 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.995224 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.995241 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:11 crc kubenswrapper[4752]: I0929 10:45:11.995253 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:11Z","lastTransitionTime":"2025-09-29T10:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.014436 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.035198 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:09Z\\\",\\\"message\\\":\\\"update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/marketplace-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true 
skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 10:45:08.961904 6414 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0929 10:45:08.961922 6414 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0929 10:45:08.961943 6414 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0929 10:45:08.961950 6414 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:45:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf48
69b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.054938 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sq7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a33b92e-d79c-4162-8500-df7a89df8df3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sq7f4\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.070172 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.085362 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.097636 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.097712 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.097726 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.097752 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.097767 4752 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:12Z","lastTransitionTime":"2025-09-29T10:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.100095 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.114595 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.126326 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5dba49df10714a5f00ec40865af87528f6bee63ee58a89f299af7c10e4d769\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://073cf9e4675b04d77ad58f0b7e1b313e3fe15e8daee4e1c8934a90924b04ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.139699 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22c62d6c-d29c-416f-bfeb-476f97181a39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bab580564f9dd31f6b2ea23a31918a9fdd2f247d13a0bd882f38dbaee4bf0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5132d9611cf58eef747d86fd0cef4eb52366b9d1bacc6df0cf5be145d3998\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6d3ad808fe69e726b66a03be183d33f000a614fadbc7f644015633fbb2b457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert
-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.157829 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.177754 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.191759 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.200029 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.200108 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.200125 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 
10:45:12.200154 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.200171 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:12Z","lastTransitionTime":"2025-09-29T10:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.205945 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.218222 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232
ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.232158 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.303233 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.303287 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.303299 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.303318 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.303328 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:12Z","lastTransitionTime":"2025-09-29T10:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.405883 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.405917 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.405926 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.405941 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.405949 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:12Z","lastTransitionTime":"2025-09-29T10:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.509466 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.509618 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.509639 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.509703 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.509729 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:12Z","lastTransitionTime":"2025-09-29T10:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.613139 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.613209 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.613222 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.613245 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.613264 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:12Z","lastTransitionTime":"2025-09-29T10:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.716501 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.716572 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.716584 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.716601 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.716612 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:12Z","lastTransitionTime":"2025-09-29T10:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.772517 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:45:12 crc kubenswrapper[4752]: E0929 10:45:12.772681 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-29 10:45:44.772660529 +0000 UTC m=+85.561802196 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.819597 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.819705 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.819728 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.819790 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.819850 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:12Z","lastTransitionTime":"2025-09-29T10:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.873329 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.873416 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.873465 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.873505 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:12 crc kubenswrapper[4752]: E0929 10:45:12.873551 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Sep 29 10:45:12 crc kubenswrapper[4752]: E0929 10:45:12.873666 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 10:45:12 crc kubenswrapper[4752]: E0929 10:45:12.873690 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 10:45:12 crc kubenswrapper[4752]: E0929 10:45:12.873717 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 10:45:44.873691645 +0000 UTC m=+85.662833372 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 10:45:12 crc kubenswrapper[4752]: E0929 10:45:12.873762 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 10:45:44.873753567 +0000 UTC m=+85.662895244 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 10:45:12 crc kubenswrapper[4752]: E0929 10:45:12.873698 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 10:45:12 crc kubenswrapper[4752]: E0929 10:45:12.873788 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:45:12 crc kubenswrapper[4752]: E0929 10:45:12.873668 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 10:45:12 crc kubenswrapper[4752]: E0929 10:45:12.873882 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 10:45:12 crc kubenswrapper[4752]: E0929 10:45:12.873894 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:45:12 crc kubenswrapper[4752]: E0929 10:45:12.873854 4752 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 10:45:44.873845949 +0000 UTC m=+85.662987706 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:45:12 crc kubenswrapper[4752]: E0929 10:45:12.873933 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 10:45:44.873921801 +0000 UTC m=+85.663063528 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.924936 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.924988 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.925001 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.925021 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:12 crc kubenswrapper[4752]: I0929 10:45:12.925035 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:12Z","lastTransitionTime":"2025-09-29T10:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.027578 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.027649 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.027668 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.027692 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.027708 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:13Z","lastTransitionTime":"2025-09-29T10:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.030879 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.030921 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.031012 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:13 crc kubenswrapper[4752]: E0929 10:45:13.031113 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:13 crc kubenswrapper[4752]: E0929 10:45:13.031225 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.031453 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:13 crc kubenswrapper[4752]: E0929 10:45:13.031544 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:13 crc kubenswrapper[4752]: E0929 10:45:13.031868 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.130955 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.131039 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.131078 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.131110 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.131139 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:13Z","lastTransitionTime":"2025-09-29T10:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.233696 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.234025 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.234090 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.234153 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.234224 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:13Z","lastTransitionTime":"2025-09-29T10:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.337017 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.337124 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.337147 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.337168 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.337179 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:13Z","lastTransitionTime":"2025-09-29T10:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.440371 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.440460 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.440495 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.440549 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.440574 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:13Z","lastTransitionTime":"2025-09-29T10:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.542787 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.542849 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.542862 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.542877 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.542887 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:13Z","lastTransitionTime":"2025-09-29T10:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.645650 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.645724 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.645745 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.645771 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.645795 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:13Z","lastTransitionTime":"2025-09-29T10:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.748938 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.749027 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.749256 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.749357 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.749407 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:13Z","lastTransitionTime":"2025-09-29T10:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.852475 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.853023 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.853255 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.853457 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.853643 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:13Z","lastTransitionTime":"2025-09-29T10:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.957345 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.957448 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.957459 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.957477 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:13 crc kubenswrapper[4752]: I0929 10:45:13.957487 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:13Z","lastTransitionTime":"2025-09-29T10:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.061180 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.061238 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.061249 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.061274 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.061285 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:14Z","lastTransitionTime":"2025-09-29T10:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.164392 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.164440 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.164455 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.164473 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.164484 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:14Z","lastTransitionTime":"2025-09-29T10:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.266997 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.267051 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.267060 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.267079 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.267092 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:14Z","lastTransitionTime":"2025-09-29T10:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.371110 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.371169 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.371187 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.371214 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.371237 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:14Z","lastTransitionTime":"2025-09-29T10:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.474538 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.474626 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.474670 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.474694 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.474708 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:14Z","lastTransitionTime":"2025-09-29T10:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.577969 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.578018 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.578027 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.578044 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.578055 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:14Z","lastTransitionTime":"2025-09-29T10:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.681363 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.681437 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.681457 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.681491 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.681511 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:14Z","lastTransitionTime":"2025-09-29T10:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.784403 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.784471 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.784485 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.784513 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.784531 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:14Z","lastTransitionTime":"2025-09-29T10:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.887631 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.887676 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.887689 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.887709 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.887722 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:14Z","lastTransitionTime":"2025-09-29T10:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.990362 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.990426 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.990439 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.990460 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:14 crc kubenswrapper[4752]: I0929 10:45:14.990475 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:14Z","lastTransitionTime":"2025-09-29T10:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.030289 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.030379 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.030448 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:15 crc kubenswrapper[4752]: E0929 10:45:15.030459 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.030504 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:15 crc kubenswrapper[4752]: E0929 10:45:15.030666 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:15 crc kubenswrapper[4752]: E0929 10:45:15.030751 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:15 crc kubenswrapper[4752]: E0929 10:45:15.030965 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.093339 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.093389 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.093401 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.093419 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.093432 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:15Z","lastTransitionTime":"2025-09-29T10:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.196382 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.196476 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.196507 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.196544 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.196568 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:15Z","lastTransitionTime":"2025-09-29T10:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.299405 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.299456 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.299474 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.299493 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.299505 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:15Z","lastTransitionTime":"2025-09-29T10:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.402347 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.402428 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.402449 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.402499 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.402514 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:15Z","lastTransitionTime":"2025-09-29T10:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.505929 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.505999 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.506017 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.506044 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.506060 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:15Z","lastTransitionTime":"2025-09-29T10:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.609178 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.609220 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.609229 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.609246 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.609258 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:15Z","lastTransitionTime":"2025-09-29T10:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.712907 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.712996 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.713010 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.713033 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.713046 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:15Z","lastTransitionTime":"2025-09-29T10:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.815502 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.815549 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.815566 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.815582 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.815593 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:15Z","lastTransitionTime":"2025-09-29T10:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.918291 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.918330 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.918339 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.918356 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:15 crc kubenswrapper[4752]: I0929 10:45:15.918366 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:15Z","lastTransitionTime":"2025-09-29T10:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.022326 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.022452 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.022472 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.022497 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.022546 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:16Z","lastTransitionTime":"2025-09-29T10:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.125949 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.126027 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.126044 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.126069 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.126083 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:16Z","lastTransitionTime":"2025-09-29T10:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.229186 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.229243 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.229253 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.229273 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.229283 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:16Z","lastTransitionTime":"2025-09-29T10:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.332499 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.332548 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.332560 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.332580 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.332598 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:16Z","lastTransitionTime":"2025-09-29T10:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.434894 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.434942 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.434951 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.434966 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.434976 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:16Z","lastTransitionTime":"2025-09-29T10:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.537710 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.537788 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.537813 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.537829 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.537840 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:16Z","lastTransitionTime":"2025-09-29T10:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.640469 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.640525 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.640538 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.640560 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.640576 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:16Z","lastTransitionTime":"2025-09-29T10:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.744082 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.744157 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.744170 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.744195 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.744208 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:16Z","lastTransitionTime":"2025-09-29T10:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.846929 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.846996 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.847008 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.847029 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.847041 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:16Z","lastTransitionTime":"2025-09-29T10:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.950657 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.950723 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.950736 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.950757 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:16 crc kubenswrapper[4752]: I0929 10:45:16.950771 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:16Z","lastTransitionTime":"2025-09-29T10:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.030391 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.030654 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:17 crc kubenswrapper[4752]: E0929 10:45:17.030896 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.031366 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:17 crc kubenswrapper[4752]: E0929 10:45:17.031495 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.031554 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:17 crc kubenswrapper[4752]: E0929 10:45:17.031624 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:17 crc kubenswrapper[4752]: E0929 10:45:17.031744 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.054714 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.054762 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.054773 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.054791 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.054817 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:17Z","lastTransitionTime":"2025-09-29T10:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.157973 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.158050 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.158065 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.158092 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.158109 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:17Z","lastTransitionTime":"2025-09-29T10:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.260842 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.260894 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.260911 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.260930 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.260945 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:17Z","lastTransitionTime":"2025-09-29T10:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.364046 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.364126 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.364141 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.364164 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.364522 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:17Z","lastTransitionTime":"2025-09-29T10:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.468328 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.468386 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.468396 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.468413 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.468423 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:17Z","lastTransitionTime":"2025-09-29T10:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.571305 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.571350 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.571361 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.571379 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.571393 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:17Z","lastTransitionTime":"2025-09-29T10:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.673360 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.673404 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.673415 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.673434 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.673446 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:17Z","lastTransitionTime":"2025-09-29T10:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.720786 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.720866 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.720880 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.720899 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.720912 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:17Z","lastTransitionTime":"2025-09-29T10:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:17 crc kubenswrapper[4752]: E0929 10:45:17.733899 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:17Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.738052 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.738103 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.738114 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.738133 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.738147 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:17Z","lastTransitionTime":"2025-09-29T10:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:17 crc kubenswrapper[4752]: E0929 10:45:17.752689 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:17Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.757010 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.757076 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.757095 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.757120 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.757138 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:17Z","lastTransitionTime":"2025-09-29T10:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:17 crc kubenswrapper[4752]: E0929 10:45:17.769736 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:17Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.773109 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.773141 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.773151 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.773168 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.773179 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:17Z","lastTransitionTime":"2025-09-29T10:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:17 crc kubenswrapper[4752]: E0929 10:45:17.784319 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:17Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.788273 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.788308 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.788316 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.788333 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.788345 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:17Z","lastTransitionTime":"2025-09-29T10:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:17 crc kubenswrapper[4752]: E0929 10:45:17.803987 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:17Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:17 crc kubenswrapper[4752]: E0929 10:45:17.804150 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.806296 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.806344 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.806353 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.806370 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.806381 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:17Z","lastTransitionTime":"2025-09-29T10:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.909386 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.909441 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.909453 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.909473 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:17 crc kubenswrapper[4752]: I0929 10:45:17.909487 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:17Z","lastTransitionTime":"2025-09-29T10:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.012264 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.012344 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.012363 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.012393 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.012414 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:18Z","lastTransitionTime":"2025-09-29T10:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.115364 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.115421 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.115431 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.115449 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.115461 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:18Z","lastTransitionTime":"2025-09-29T10:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.218647 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.218718 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.218733 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.218753 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.218765 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:18Z","lastTransitionTime":"2025-09-29T10:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.321638 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.321701 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.321710 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.321728 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.321739 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:18Z","lastTransitionTime":"2025-09-29T10:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.424154 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.424203 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.424211 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.424226 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.424237 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:18Z","lastTransitionTime":"2025-09-29T10:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.527479 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.527532 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.527544 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.527563 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.527575 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:18Z","lastTransitionTime":"2025-09-29T10:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.631109 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.631208 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.631237 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.631269 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.631293 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:18Z","lastTransitionTime":"2025-09-29T10:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.734428 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.734472 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.734482 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.734497 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.734506 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:18Z","lastTransitionTime":"2025-09-29T10:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.837218 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.837263 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.837273 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.837293 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.837306 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:18Z","lastTransitionTime":"2025-09-29T10:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.940179 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.940268 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.940281 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.940301 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:18 crc kubenswrapper[4752]: I0929 10:45:18.940314 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:18Z","lastTransitionTime":"2025-09-29T10:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.030751 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.030764 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:19 crc kubenswrapper[4752]: E0929 10:45:19.031005 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.030777 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.030783 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:19 crc kubenswrapper[4752]: E0929 10:45:19.031172 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:19 crc kubenswrapper[4752]: E0929 10:45:19.031233 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:19 crc kubenswrapper[4752]: E0929 10:45:19.031274 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.043407 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.043485 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.043497 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.043513 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.043524 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:19Z","lastTransitionTime":"2025-09-29T10:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.146653 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.146713 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.146726 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.146750 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.146763 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:19Z","lastTransitionTime":"2025-09-29T10:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.250736 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.250857 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.250871 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.250893 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.250924 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:19Z","lastTransitionTime":"2025-09-29T10:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.354569 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.354623 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.354634 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.354652 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.354665 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:19Z","lastTransitionTime":"2025-09-29T10:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.457388 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.457469 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.457490 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.457512 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.457527 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:19Z","lastTransitionTime":"2025-09-29T10:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.560539 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.560589 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.560603 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.560620 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.560632 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:19Z","lastTransitionTime":"2025-09-29T10:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.663561 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.663634 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.663644 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.663665 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.663679 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:19Z","lastTransitionTime":"2025-09-29T10:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.766340 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.766394 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.766407 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.766428 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.766441 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:19Z","lastTransitionTime":"2025-09-29T10:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.869514 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.869580 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.869598 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.869619 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.869634 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:19Z","lastTransitionTime":"2025-09-29T10:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.971831 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.971883 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.971896 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.971919 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:19 crc kubenswrapper[4752]: I0929 10:45:19.971934 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:19Z","lastTransitionTime":"2025-09-29T10:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.047007 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22c62d6c-d29c-416f-bfeb-476f97181a39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bab580564f9dd31f6b2ea23a31918a9fdd2f247d13a0bd882f38dbaee4bf0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5132d9611cf58eef747d86fd0ce
f4eb52366b9d1bacc6df0cf5be145d3998\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6d3ad808fe69e726b66a03be183d33f000a614fadbc7f644015633fbb2b457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:20Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.060143 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:20Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.074716 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.074783 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.074814 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:20 crc 
kubenswrapper[4752]: I0929 10:45:20.074837 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.074852 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:20Z","lastTransitionTime":"2025-09-29T10:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.075517 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:20Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.090841 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232
ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:20Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.106714 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:20Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.132211 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10
:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:20Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.148422 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:20Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.160601 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:20Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.178139 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.178205 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.178215 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.178235 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.178246 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:20Z","lastTransitionTime":"2025-09-29T10:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.179615 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:20Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.199958 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:09Z\\\",\\\"message\\\":\\\"update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/marketplace-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true 
skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 10:45:08.961904 6414 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0929 10:45:08.961922 6414 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0929 10:45:08.961943 6414 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0929 10:45:08.961950 6414 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:45:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf48
69b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:20Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.214543 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://362298e6215cc1a9971973419e58a45e5ded2c4120b1e800afd87f480f6fd3d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695cc
da2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:20Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.227251 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:20Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.239593 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:20Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.253949 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:20Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.267985 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-09-29T10:45:20Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.280867 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.280911 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.280923 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.280940 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.280952 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:20Z","lastTransitionTime":"2025-09-29T10:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.286168 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5dba49df10714a5f00ec40865af87528f6bee63ee58a89f299af7c10e4d769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://073cf9e4675b04d77ad58f0b7e1b313e3fe15e8daee4e1c8934a90924b04ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:20Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.300474 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sq7f4" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a33b92e-d79c-4162-8500-df7a89df8df3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sq7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:20Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:20 crc 
kubenswrapper[4752]: I0929 10:45:20.315553 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:20Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.384491 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.384554 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.384564 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.384585 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.384595 4752 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:20Z","lastTransitionTime":"2025-09-29T10:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.488092 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.488139 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.488151 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.488166 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.488176 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:20Z","lastTransitionTime":"2025-09-29T10:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.591818 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.592192 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.592206 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.592227 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.592241 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:20Z","lastTransitionTime":"2025-09-29T10:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.696032 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.696100 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.696122 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.696142 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.696154 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:20Z","lastTransitionTime":"2025-09-29T10:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.799397 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.799447 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.799459 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.799478 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.799491 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:20Z","lastTransitionTime":"2025-09-29T10:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.902508 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.902589 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.902600 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.902622 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:20 crc kubenswrapper[4752]: I0929 10:45:20.902637 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:20Z","lastTransitionTime":"2025-09-29T10:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.006259 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.006321 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.006337 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.006361 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.006380 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:21Z","lastTransitionTime":"2025-09-29T10:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.030111 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.030111 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 10:45:21 crc kubenswrapper[4752]: E0929 10:45:21.030317 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.030140 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 10:45:21 crc kubenswrapper[4752]: E0929 10:45:21.030403 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.030123 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4"
Sep 29 10:45:21 crc kubenswrapper[4752]: E0929 10:45:21.030605 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3"
Sep 29 10:45:21 crc kubenswrapper[4752]: E0929 10:45:21.030706 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.109180 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.109280 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.109291 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.109313 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.109328 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:21Z","lastTransitionTime":"2025-09-29T10:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.212523 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.212575 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.212584 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.212601 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.212611 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:21Z","lastTransitionTime":"2025-09-29T10:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.315165 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.315201 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.315217 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.315235 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.315249 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:21Z","lastTransitionTime":"2025-09-29T10:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.418526 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.418614 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.418629 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.418648 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.418662 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:21Z","lastTransitionTime":"2025-09-29T10:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.521385 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.521443 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.521451 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.521469 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.521480 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:21Z","lastTransitionTime":"2025-09-29T10:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.624933 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.624990 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.625002 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.625022 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.625034 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:21Z","lastTransitionTime":"2025-09-29T10:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.731079 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.731190 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.731206 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.731227 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.731372 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:21Z","lastTransitionTime":"2025-09-29T10:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.834878 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.834929 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.834965 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.834985 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.835014 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:21Z","lastTransitionTime":"2025-09-29T10:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.937566 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.937617 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.937630 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.937651 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:21 crc kubenswrapper[4752]: I0929 10:45:21.937666 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:21Z","lastTransitionTime":"2025-09-29T10:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.040677 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.040750 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.040762 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.040781 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.040794 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:22Z","lastTransitionTime":"2025-09-29T10:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.143888 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.143939 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.143951 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.143971 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.143983 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:22Z","lastTransitionTime":"2025-09-29T10:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.247266 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.247327 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.247345 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.247370 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.247390 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:22Z","lastTransitionTime":"2025-09-29T10:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.349929 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.350014 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.350059 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.350079 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.350094 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:22Z","lastTransitionTime":"2025-09-29T10:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.452660 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.452710 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.452722 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.452744 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.452756 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:22Z","lastTransitionTime":"2025-09-29T10:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.555236 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.555311 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.555325 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.555369 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.555382 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:22Z","lastTransitionTime":"2025-09-29T10:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.659510 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.659574 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.659585 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.659605 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.659619 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:22Z","lastTransitionTime":"2025-09-29T10:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.762513 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.762575 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.762585 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.762602 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.762613 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:22Z","lastTransitionTime":"2025-09-29T10:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.865232 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.865278 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.865291 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.865310 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.865321 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:22Z","lastTransitionTime":"2025-09-29T10:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.968243 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.968295 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.968315 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.968334 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:22 crc kubenswrapper[4752]: I0929 10:45:22.968345 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:22Z","lastTransitionTime":"2025-09-29T10:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.030225 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.030338 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.030340 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4"
Sep 29 10:45:23 crc kubenswrapper[4752]: E0929 10:45:23.030392 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.030436 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 10:45:23 crc kubenswrapper[4752]: E0929 10:45:23.030581 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 29 10:45:23 crc kubenswrapper[4752]: E0929 10:45:23.030689 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3"
Sep 29 10:45:23 crc kubenswrapper[4752]: E0929 10:45:23.030759 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.070453 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.070508 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.070520 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.070543 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.070557 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:23Z","lastTransitionTime":"2025-09-29T10:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.173549 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.173589 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.173599 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.173615 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.173627 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:23Z","lastTransitionTime":"2025-09-29T10:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.280045 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.280074 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.280083 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.280099 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.280108 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:23Z","lastTransitionTime":"2025-09-29T10:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.383234 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.383285 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.383303 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.383329 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.383347 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:23Z","lastTransitionTime":"2025-09-29T10:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.486292 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.486359 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.486374 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.486397 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.486412 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:23Z","lastTransitionTime":"2025-09-29T10:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.600118 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.600202 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.600229 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.600264 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.600291 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:23Z","lastTransitionTime":"2025-09-29T10:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.703531 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.703561 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.703571 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.703588 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.703600 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:23Z","lastTransitionTime":"2025-09-29T10:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.808102 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.808170 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.808183 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.808202 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.808215 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:23Z","lastTransitionTime":"2025-09-29T10:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.911680 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.911733 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.911750 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.911772 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:23 crc kubenswrapper[4752]: I0929 10:45:23.911789 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:23Z","lastTransitionTime":"2025-09-29T10:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.017231 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.017296 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.017311 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.017334 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.017352 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:24Z","lastTransitionTime":"2025-09-29T10:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.120475 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.120538 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.120555 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.120577 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.120592 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:24Z","lastTransitionTime":"2025-09-29T10:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.223756 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.223823 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.223833 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.223853 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.223865 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:24Z","lastTransitionTime":"2025-09-29T10:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.326933 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.326986 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.326999 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.327016 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.327029 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:24Z","lastTransitionTime":"2025-09-29T10:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.429214 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.429259 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.429271 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.429297 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.429322 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:24Z","lastTransitionTime":"2025-09-29T10:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.531632 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.531683 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.531694 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.531711 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.531723 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:24Z","lastTransitionTime":"2025-09-29T10:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.634364 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.634418 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.634429 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.634446 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.634460 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:24Z","lastTransitionTime":"2025-09-29T10:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.738628 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.738698 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.738712 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.738730 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.738742 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:24Z","lastTransitionTime":"2025-09-29T10:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.841067 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.841126 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.841137 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.841157 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.841168 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:24Z","lastTransitionTime":"2025-09-29T10:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.943944 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.943994 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.944006 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.944023 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:24 crc kubenswrapper[4752]: I0929 10:45:24.944037 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:24Z","lastTransitionTime":"2025-09-29T10:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.030754 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.030864 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.030766 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:25 crc kubenswrapper[4752]: E0929 10:45:25.030939 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:25 crc kubenswrapper[4752]: E0929 10:45:25.031005 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.030774 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:25 crc kubenswrapper[4752]: E0929 10:45:25.031646 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:25 crc kubenswrapper[4752]: E0929 10:45:25.031731 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.046352 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.046396 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.046408 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.046424 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.046436 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:25Z","lastTransitionTime":"2025-09-29T10:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.149548 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.149603 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.149619 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.149645 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.149662 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:25Z","lastTransitionTime":"2025-09-29T10:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.252342 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.252418 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.252433 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.252454 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.252497 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:25Z","lastTransitionTime":"2025-09-29T10:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.354919 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.354951 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.354960 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.354972 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.354980 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:25Z","lastTransitionTime":"2025-09-29T10:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.457196 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.457254 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.457264 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.457294 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.457305 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:25Z","lastTransitionTime":"2025-09-29T10:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.560243 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.560300 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.560316 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.560354 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.560367 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:25Z","lastTransitionTime":"2025-09-29T10:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.663613 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.663659 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.663667 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.663683 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.663692 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:25Z","lastTransitionTime":"2025-09-29T10:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.766451 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.766514 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.766526 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.766543 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.766555 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:25Z","lastTransitionTime":"2025-09-29T10:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.869021 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.869073 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.869086 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.869102 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.869115 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:25Z","lastTransitionTime":"2025-09-29T10:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.971962 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.972024 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.972039 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.972062 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:25 crc kubenswrapper[4752]: I0929 10:45:25.972078 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:25Z","lastTransitionTime":"2025-09-29T10:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.031683 4752 scope.go:117] "RemoveContainer" containerID="d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76" Sep 29 10:45:26 crc kubenswrapper[4752]: E0929 10:45:26.032107 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.075626 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.075711 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.075945 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.075978 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.075996 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:26Z","lastTransitionTime":"2025-09-29T10:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.178384 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.178419 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.178429 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.178445 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.178455 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:26Z","lastTransitionTime":"2025-09-29T10:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.280943 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.280978 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.280987 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.281000 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.281011 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:26Z","lastTransitionTime":"2025-09-29T10:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.383587 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.383921 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.384058 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.384187 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.384310 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:26Z","lastTransitionTime":"2025-09-29T10:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.488033 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.488077 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.488089 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.488106 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.488120 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:26Z","lastTransitionTime":"2025-09-29T10:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.590828 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.590876 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.590889 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.590914 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.590927 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:26Z","lastTransitionTime":"2025-09-29T10:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.693575 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.693618 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.693633 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.693654 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.693668 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:26Z","lastTransitionTime":"2025-09-29T10:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.796160 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.796222 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.796234 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.796252 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.796264 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:26Z","lastTransitionTime":"2025-09-29T10:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.900243 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.900301 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.900312 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.900333 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:26 crc kubenswrapper[4752]: I0929 10:45:26.900348 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:26Z","lastTransitionTime":"2025-09-29T10:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.003129 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.003182 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.003198 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.003219 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.003233 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:27Z","lastTransitionTime":"2025-09-29T10:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.030783 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.030927 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.030919 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.030850 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:27 crc kubenswrapper[4752]: E0929 10:45:27.031261 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:27 crc kubenswrapper[4752]: E0929 10:45:27.031374 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:27 crc kubenswrapper[4752]: E0929 10:45:27.031482 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:27 crc kubenswrapper[4752]: E0929 10:45:27.031616 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.106867 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.107373 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.107516 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.107651 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.107793 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:27Z","lastTransitionTime":"2025-09-29T10:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.211103 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.211945 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.212021 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.212062 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.212092 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:27Z","lastTransitionTime":"2025-09-29T10:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.314816 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.315144 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.315223 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.315344 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.315415 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:27Z","lastTransitionTime":"2025-09-29T10:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.418907 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.418977 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.418990 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.419014 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.419029 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:27Z","lastTransitionTime":"2025-09-29T10:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.522205 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.522266 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.522279 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.522297 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.522310 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:27Z","lastTransitionTime":"2025-09-29T10:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.625255 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.625337 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.625352 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.625380 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.625396 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:27Z","lastTransitionTime":"2025-09-29T10:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.648051 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs\") pod \"network-metrics-daemon-sq7f4\" (UID: \"0a33b92e-d79c-4162-8500-df7a89df8df3\") " pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:27 crc kubenswrapper[4752]: E0929 10:45:27.648230 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 10:45:27 crc kubenswrapper[4752]: E0929 10:45:27.648315 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs podName:0a33b92e-d79c-4162-8500-df7a89df8df3 nodeName:}" failed. No retries permitted until 2025-09-29 10:45:59.648292509 +0000 UTC m=+100.437434176 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs") pod "network-metrics-daemon-sq7f4" (UID: "0a33b92e-d79c-4162-8500-df7a89df8df3") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.728721 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.728777 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.728790 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.728831 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.728846 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:27Z","lastTransitionTime":"2025-09-29T10:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.832317 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.832367 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.832381 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.832399 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.832409 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:27Z","lastTransitionTime":"2025-09-29T10:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.918854 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.918919 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.918930 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.918953 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.918968 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:27Z","lastTransitionTime":"2025-09-29T10:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:27 crc kubenswrapper[4752]: E0929 10:45:27.940012 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:27Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.944912 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.944992 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.945005 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.945027 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.945039 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:27Z","lastTransitionTime":"2025-09-29T10:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:27 crc kubenswrapper[4752]: E0929 10:45:27.959605 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:27Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.964566 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.964626 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.964689 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.964713 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.964726 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:27Z","lastTransitionTime":"2025-09-29T10:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:27 crc kubenswrapper[4752]: E0929 10:45:27.978781 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:27Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.983480 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.983725 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.983735 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.983754 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:27 crc kubenswrapper[4752]: I0929 10:45:27.983767 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:27Z","lastTransitionTime":"2025-09-29T10:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:27 crc kubenswrapper[4752]: E0929 10:45:27.999156 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:27Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.003516 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.003753 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.003928 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.004056 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.004135 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:28Z","lastTransitionTime":"2025-09-29T10:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:28 crc kubenswrapper[4752]: E0929 10:45:28.017475 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:28Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:28 crc kubenswrapper[4752]: E0929 10:45:28.017928 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.019761 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.019825 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.019837 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.019856 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.019867 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:28Z","lastTransitionTime":"2025-09-29T10:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.122585 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.122635 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.122648 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.122671 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.122685 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:28Z","lastTransitionTime":"2025-09-29T10:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.225913 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.225965 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.225979 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.226003 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.226021 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:28Z","lastTransitionTime":"2025-09-29T10:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.328991 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.329075 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.329104 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.329137 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.329161 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:28Z","lastTransitionTime":"2025-09-29T10:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.432342 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.432386 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.432400 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.432417 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.432430 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:28Z","lastTransitionTime":"2025-09-29T10:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.534600 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.534651 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.534662 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.534679 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.534692 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:28Z","lastTransitionTime":"2025-09-29T10:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.638295 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.638361 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.638375 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.638392 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.638402 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:28Z","lastTransitionTime":"2025-09-29T10:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.740789 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.740846 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.740854 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.740871 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.740883 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:28Z","lastTransitionTime":"2025-09-29T10:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.843879 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.843945 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.843958 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.843978 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.843990 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:28Z","lastTransitionTime":"2025-09-29T10:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.947095 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.947488 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.947714 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.948036 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:28 crc kubenswrapper[4752]: I0929 10:45:28.948245 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:28Z","lastTransitionTime":"2025-09-29T10:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.030184 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.030185 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.030337 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.030578 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:29 crc kubenswrapper[4752]: E0929 10:45:29.030570 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:29 crc kubenswrapper[4752]: E0929 10:45:29.030794 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:29 crc kubenswrapper[4752]: E0929 10:45:29.030887 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:29 crc kubenswrapper[4752]: E0929 10:45:29.030992 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.051222 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.051273 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.051281 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.051298 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.051308 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:29Z","lastTransitionTime":"2025-09-29T10:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.156969 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.157023 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.157035 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.157057 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.157074 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:29Z","lastTransitionTime":"2025-09-29T10:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.260846 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.260882 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.260892 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.260908 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.260920 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:29Z","lastTransitionTime":"2025-09-29T10:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.364620 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.365187 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.365199 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.365215 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.365224 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:29Z","lastTransitionTime":"2025-09-29T10:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.468228 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.468295 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.468308 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.468324 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.468336 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:29Z","lastTransitionTime":"2025-09-29T10:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.483221 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xv5q7_52fc9378-c37b-424b-afde-7b191bab5fde/kube-multus/0.log" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.483275 4752 generic.go:334] "Generic (PLEG): container finished" podID="52fc9378-c37b-424b-afde-7b191bab5fde" containerID="30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189" exitCode=1 Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.483320 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xv5q7" event={"ID":"52fc9378-c37b-424b-afde-7b191bab5fde","Type":"ContainerDied","Data":"30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189"} Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.483965 4752 scope.go:117] "RemoveContainer" containerID="30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.504342 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:45:29Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.519854 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:29Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.532035 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:29Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.552263 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10
:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:29Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.568759 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:29Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.571078 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.571125 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.571138 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.571157 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.571170 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:29Z","lastTransitionTime":"2025-09-29T10:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.584392 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:29Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.599406 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6
bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203
bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-
cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:29Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.620373 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:09Z\\\",\\\"message\\\":\\\"update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/marketplace-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true 
skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 10:45:08.961904 6414 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0929 10:45:08.961922 6414 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0929 10:45:08.961943 6414 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0929 10:45:08.961950 6414 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:45:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf48
69b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:29Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.636182 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://362298e6215cc1a9971973419e58a45e5ded2c4120b1e800afd87f480f6fd3d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695cc
da2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:29Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.651010 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:29Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.665913 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:29Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.673753 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 
10:45:29.674048 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.674149 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.674247 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.674339 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:29Z","lastTransitionTime":"2025-09-29T10:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.680383 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:29Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.693928 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:29Z\\\",\\\"message\\\":\\\"2025-09-29T10:44:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_46ee9782-bab0-46a6-9758-0bdfa32662ba\\\\n2025-09-29T10:44:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_46ee9782-bab0-46a6-9758-0bdfa32662ba to /host/opt/cni/bin/\\\\n2025-09-29T10:44:44Z [verbose] multus-daemon started\\\\n2025-09-29T10:44:44Z [verbose] Readiness Indicator file check\\\\n2025-09-29T10:45:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:29Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.707652 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5dba49df10714a5f00ec40865af87528f6bee63ee58a89f299af7c10e4d769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://073cf9e4675b04d77ad58f0b7e1b313e3fe15e8daee4e1c8934a90924b04ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:29Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.728268 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sq7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a33b92e-d79c-4162-8500-df7a89df8df3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sq7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:29Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:29 crc 
kubenswrapper[4752]: I0929 10:45:29.742399 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:29Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.754128 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22c62d6c-d29c-416f-bfeb-476f97181a39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bab580564f9dd31f6b2ea23a31918a9fdd2f247d13a0bd882f38dbaee4bf0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5132d9611cf58eef747d86fd0cef4eb52366b9d1bacc6df0cf5be145d3998\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6d3ad808fe69e726b66a03be183d33f000a614fadbc7f644015633fbb2b457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:29Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.765886 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:29Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.776469 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.776509 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.776568 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:29 crc 
kubenswrapper[4752]: I0929 10:45:29.776624 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.776636 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:29Z","lastTransitionTime":"2025-09-29T10:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.879138 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.879208 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.879222 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.879241 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.879253 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:29Z","lastTransitionTime":"2025-09-29T10:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.981718 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.981769 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.981781 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.981816 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:29 crc kubenswrapper[4752]: I0929 10:45:29.981831 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:29Z","lastTransitionTime":"2025-09-29T10:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.054786 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.071490 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.084780 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.085053 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.085163 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.085259 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.085338 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:30Z","lastTransitionTime":"2025-09-29T10:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.100218 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.114675 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.127956 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.148741 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.166751 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:09Z\\\",\\\"message\\\":\\\"update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/marketplace-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true 
skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 10:45:08.961904 6414 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0929 10:45:08.961922 6414 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0929 10:45:08.961943 6414 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0929 10:45:08.961950 6414 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:45:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf48
69b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.181191 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://362298e6215cc1a9971973419e58a45e5ded2c4120b1e800afd87f480f6fd3d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695cc
da2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.187536 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.187584 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.187597 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.187619 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.187757 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:30Z","lastTransitionTime":"2025-09-29T10:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.197729 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.209643 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.223829 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.236363 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:29Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:29Z\\\",\\\"message\\\":\\\"2025-09-29T10:44:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_46ee9782-bab0-46a6-9758-0bdfa32662ba\\\\n2025-09-29T10:44:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_46ee9782-bab0-46a6-9758-0bdfa32662ba to /host/opt/cni/bin/\\\\n2025-09-29T10:44:44Z [verbose] multus-daemon started\\\\n2025-09-29T10:44:44Z [verbose] Readiness Indicator file check\\\\n2025-09-29T10:45:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.249133 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5dba49df10714a5f00ec40865af87528f6bee63ee58a89f299af7c10e4d769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://073cf9e4675b04d77ad58f0b7e1b313e3fe15e8daee4e1c8934a90924b04ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.261766 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sq7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a33b92e-d79c-4162-8500-df7a89df8df3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sq7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc 
kubenswrapper[4752]: I0929 10:45:30.276933 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.289369 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.290422 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 
10:45:30.290519 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.290612 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.290687 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.290764 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:30Z","lastTransitionTime":"2025-09-29T10:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.303580 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22c62d6c-d29c-416f-bfeb-476f97181a39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bab580564f9dd31f6b2ea23a31918a9fdd2f247d13a0bd882f38dbaee4bf0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5132d9611cf58eef747d86fd0cef4eb52366b9d1bacc6df0cf5be145d3998\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6d3ad808fe69e726b66a03be183d33f000a614fadbc7f644015633fbb2b457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.317103 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.393541 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.393580 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.393588 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:30 crc 
kubenswrapper[4752]: I0929 10:45:30.393607 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.393616 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:30Z","lastTransitionTime":"2025-09-29T10:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.488922 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xv5q7_52fc9378-c37b-424b-afde-7b191bab5fde/kube-multus/0.log" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.489021 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xv5q7" event={"ID":"52fc9378-c37b-424b-afde-7b191bab5fde","Type":"ContainerStarted","Data":"d36b7c0411c2a5cbcb37f626fa70cfe6c7d3fc6280f6a9e882fa27766f6de761"} Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.495288 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.495319 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.495329 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.495344 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.495354 4752 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:30Z","lastTransitionTime":"2025-09-29T10:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.504940 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22c62d6c-d29c-416f-bfeb-476f97181a39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bab580564f9dd31f6b2ea23a31918a9fdd2f247d13a0bd882f38dbaee4bf0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5132d9611cf58eef747d86fd0cef4eb52366b9d1bacc6df0cf5be145d3998\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6d3ad808fe69e726b66a03be183d33f000a614fadbc7f644015633fbb2b457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.518101 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.531227 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232
ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.540630 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.561433 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10
:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.575465 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.588036 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.597982 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.598240 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.598393 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.598512 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.598599 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:30Z","lastTransitionTime":"2025-09-29T10:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.604353 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.627834 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:09Z\\\",\\\"message\\\":\\\"update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/marketplace-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true 
skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 10:45:08.961904 6414 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0929 10:45:08.961922 6414 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0929 10:45:08.961943 6414 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0929 10:45:08.961950 6414 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:45:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf48
69b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.640883 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://362298e6215cc1a9971973419e58a45e5ded2c4120b1e800afd87f480f6fd3d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695cc
da2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.652969 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.664764 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.678722 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.694036 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36b7c0411c2a5cbcb3
7f626fa70cfe6c7d3fc6280f6a9e882fa27766f6de761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:29Z\\\",\\\"message\\\":\\\"2025-09-29T10:44:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_46ee9782-bab0-46a6-9758-0bdfa32662ba\\\\n2025-09-29T10:44:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_46ee9782-bab0-46a6-9758-0bdfa32662ba to /host/opt/cni/bin/\\\\n2025-09-29T10:44:44Z [verbose] multus-daemon started\\\\n2025-09-29T10:44:44Z [verbose] Readiness Indicator file check\\\\n2025-09-29T10:45:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.704738 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.704824 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.704838 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.704854 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.704866 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:30Z","lastTransitionTime":"2025-09-29T10:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.706966 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5dba49df10714a5f00ec40865af87528f6bee63ee58a89f299af7c10e4d769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://073cf9e4675b04d77ad58f0b7e1b313e3fe15e8daee4e1c8934a90924b04ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.719056 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sq7f4" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a33b92e-d79c-4162-8500-df7a89df8df3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sq7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc 
kubenswrapper[4752]: I0929 10:45:30.732840 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.746018 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:30Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.810085 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 
10:45:30.810120 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.810128 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.810141 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.810151 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:30Z","lastTransitionTime":"2025-09-29T10:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.912868 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.912922 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.912935 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.913000 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:30 crc kubenswrapper[4752]: I0929 10:45:30.913013 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:30Z","lastTransitionTime":"2025-09-29T10:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.015941 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.015999 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.016012 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.016027 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.016039 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:31Z","lastTransitionTime":"2025-09-29T10:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.030459 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.030504 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:31 crc kubenswrapper[4752]: E0929 10:45:31.030565 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:31 crc kubenswrapper[4752]: E0929 10:45:31.030624 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.030675 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:31 crc kubenswrapper[4752]: E0929 10:45:31.030718 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.030458 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:31 crc kubenswrapper[4752]: E0929 10:45:31.030771 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.118263 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.118308 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.118321 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.118339 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.118352 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:31Z","lastTransitionTime":"2025-09-29T10:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.221471 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.221512 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.221554 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.221575 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.221588 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:31Z","lastTransitionTime":"2025-09-29T10:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.324942 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.324985 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.324997 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.325015 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.325101 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:31Z","lastTransitionTime":"2025-09-29T10:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.427734 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.427771 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.427784 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.427799 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.427827 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:31Z","lastTransitionTime":"2025-09-29T10:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.530968 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.531016 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.531033 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.531057 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.531073 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:31Z","lastTransitionTime":"2025-09-29T10:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.633381 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.633418 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.633430 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.633446 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.633456 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:31Z","lastTransitionTime":"2025-09-29T10:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.736688 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.736757 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.736776 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.736840 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.736865 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:31Z","lastTransitionTime":"2025-09-29T10:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.838608 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.838639 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.838647 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.838659 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.838668 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:31Z","lastTransitionTime":"2025-09-29T10:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.940917 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.940965 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.940975 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.940993 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:31 crc kubenswrapper[4752]: I0929 10:45:31.941004 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:31Z","lastTransitionTime":"2025-09-29T10:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.043026 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.043067 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.043077 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.043092 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.043103 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:32Z","lastTransitionTime":"2025-09-29T10:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.146284 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.146321 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.146331 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.146345 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.146357 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:32Z","lastTransitionTime":"2025-09-29T10:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.249707 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.249761 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.249770 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.249785 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.249813 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:32Z","lastTransitionTime":"2025-09-29T10:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.352476 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.352525 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.352540 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.352560 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.352577 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:32Z","lastTransitionTime":"2025-09-29T10:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.454733 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.454779 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.454792 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.454832 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.454845 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:32Z","lastTransitionTime":"2025-09-29T10:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.556795 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.556852 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.556867 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.556881 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.556890 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:32Z","lastTransitionTime":"2025-09-29T10:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.659063 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.659125 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.659135 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.659150 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.659163 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:32Z","lastTransitionTime":"2025-09-29T10:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.762456 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.762511 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.762521 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.762537 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.762548 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:32Z","lastTransitionTime":"2025-09-29T10:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.864996 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.865055 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.865070 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.865091 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.865108 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:32Z","lastTransitionTime":"2025-09-29T10:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.967486 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.967550 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.967562 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.967598 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:32 crc kubenswrapper[4752]: I0929 10:45:32.967615 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:32Z","lastTransitionTime":"2025-09-29T10:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.030876 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.031091 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.030919 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.030916 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:33 crc kubenswrapper[4752]: E0929 10:45:33.031263 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:33 crc kubenswrapper[4752]: E0929 10:45:33.031356 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:33 crc kubenswrapper[4752]: E0929 10:45:33.031523 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:33 crc kubenswrapper[4752]: E0929 10:45:33.031595 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.070603 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.070658 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.070669 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.070684 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.070693 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:33Z","lastTransitionTime":"2025-09-29T10:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.172843 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.172884 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.172896 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.172914 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.172926 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:33Z","lastTransitionTime":"2025-09-29T10:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.275567 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.276291 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.276321 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.276342 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.276354 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:33Z","lastTransitionTime":"2025-09-29T10:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.378592 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.378647 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.378663 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.378683 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.378698 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:33Z","lastTransitionTime":"2025-09-29T10:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.480818 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.480861 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.480869 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.480884 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.480894 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:33Z","lastTransitionTime":"2025-09-29T10:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.582828 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.582897 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.582914 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.582939 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.582956 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:33Z","lastTransitionTime":"2025-09-29T10:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.685180 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.685234 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.685248 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.685267 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.685278 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:33Z","lastTransitionTime":"2025-09-29T10:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.789085 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.789154 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.789166 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.789186 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.789198 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:33Z","lastTransitionTime":"2025-09-29T10:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.892407 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.892447 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.892458 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.892480 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.892495 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:33Z","lastTransitionTime":"2025-09-29T10:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.995643 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.995687 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.995699 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.995715 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:33 crc kubenswrapper[4752]: I0929 10:45:33.995725 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:33Z","lastTransitionTime":"2025-09-29T10:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.045516 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.097869 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.097905 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.097914 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.097928 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.097937 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:34Z","lastTransitionTime":"2025-09-29T10:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.199742 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.199778 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.199788 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.199817 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.199828 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:34Z","lastTransitionTime":"2025-09-29T10:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.301475 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.301503 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.301511 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.301537 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.301546 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:34Z","lastTransitionTime":"2025-09-29T10:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.404653 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.404733 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.404748 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.404764 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.404777 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:34Z","lastTransitionTime":"2025-09-29T10:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.506878 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.506916 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.506926 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.506940 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.506951 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:34Z","lastTransitionTime":"2025-09-29T10:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.609273 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.609312 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.609322 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.609337 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.609348 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:34Z","lastTransitionTime":"2025-09-29T10:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.711347 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.711376 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.711384 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.711396 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.711405 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:34Z","lastTransitionTime":"2025-09-29T10:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.814184 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.814235 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.814247 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.814262 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.814273 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:34Z","lastTransitionTime":"2025-09-29T10:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.916094 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.916127 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.916135 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.916147 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:34 crc kubenswrapper[4752]: I0929 10:45:34.916155 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:34Z","lastTransitionTime":"2025-09-29T10:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.018917 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.018971 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.018985 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.019006 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.019020 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:35Z","lastTransitionTime":"2025-09-29T10:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.030698 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.030761 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.030740 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.030863 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:35 crc kubenswrapper[4752]: E0929 10:45:35.031004 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:35 crc kubenswrapper[4752]: E0929 10:45:35.031145 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:35 crc kubenswrapper[4752]: E0929 10:45:35.031261 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:35 crc kubenswrapper[4752]: E0929 10:45:35.031359 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.121480 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.121526 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.121537 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.121554 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.121567 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:35Z","lastTransitionTime":"2025-09-29T10:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.224334 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.224387 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.224399 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.224415 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.224427 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:35Z","lastTransitionTime":"2025-09-29T10:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.326518 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.326565 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.326573 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.326587 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.326596 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:35Z","lastTransitionTime":"2025-09-29T10:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.429317 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.429354 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.429365 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.429380 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.429390 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:35Z","lastTransitionTime":"2025-09-29T10:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.531981 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.532019 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.532028 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.532042 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.532053 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:35Z","lastTransitionTime":"2025-09-29T10:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.635401 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.635431 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.635439 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.635452 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.635459 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:35Z","lastTransitionTime":"2025-09-29T10:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.738009 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.738053 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.738065 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.738080 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.738089 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:35Z","lastTransitionTime":"2025-09-29T10:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.840923 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.841001 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.841024 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.841056 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.841082 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:35Z","lastTransitionTime":"2025-09-29T10:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.943846 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.943882 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.943891 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.943904 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:35 crc kubenswrapper[4752]: I0929 10:45:35.943913 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:35Z","lastTransitionTime":"2025-09-29T10:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.045885 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.045931 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.045943 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.045959 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.045970 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:36Z","lastTransitionTime":"2025-09-29T10:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.148688 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.148732 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.148743 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.148761 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.148771 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:36Z","lastTransitionTime":"2025-09-29T10:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.251440 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.251484 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.251501 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.251519 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.251530 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:36Z","lastTransitionTime":"2025-09-29T10:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.353977 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.354035 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.354046 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.354060 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.354071 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:36Z","lastTransitionTime":"2025-09-29T10:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.456639 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.456700 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.456718 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.456741 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.456759 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:36Z","lastTransitionTime":"2025-09-29T10:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.559320 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.559381 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.559395 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.559419 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.559434 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:36Z","lastTransitionTime":"2025-09-29T10:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.661706 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.661772 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.661783 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.661819 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.661829 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:36Z","lastTransitionTime":"2025-09-29T10:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.763858 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.763894 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.763906 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.763922 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.763933 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:36Z","lastTransitionTime":"2025-09-29T10:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.866683 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.866725 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.866734 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.866750 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.866761 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:36Z","lastTransitionTime":"2025-09-29T10:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.969742 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.969780 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.969789 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.969818 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:36 crc kubenswrapper[4752]: I0929 10:45:36.969827 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:36Z","lastTransitionTime":"2025-09-29T10:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.030713 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.030751 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.030943 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.031038 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:37 crc kubenswrapper[4752]: E0929 10:45:37.031119 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:37 crc kubenswrapper[4752]: E0929 10:45:37.031261 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:37 crc kubenswrapper[4752]: E0929 10:45:37.031338 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:37 crc kubenswrapper[4752]: E0929 10:45:37.031407 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.031469 4752 scope.go:117] "RemoveContainer" containerID="d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.072537 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.072605 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.072616 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.072633 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.072642 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:37Z","lastTransitionTime":"2025-09-29T10:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.174505 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.174541 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.174550 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.174565 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.174574 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:37Z","lastTransitionTime":"2025-09-29T10:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.277089 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.277141 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.277151 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.277165 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.277175 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:37Z","lastTransitionTime":"2025-09-29T10:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.379261 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.379299 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.379310 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.379325 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.379335 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:37Z","lastTransitionTime":"2025-09-29T10:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.480949 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.480980 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.480989 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.481002 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.481011 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:37Z","lastTransitionTime":"2025-09-29T10:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.513316 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovnkube-controller/2.log" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.515550 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerStarted","Data":"f5083afbe3807e485df0ceb9323e330b0f37722f050f83895507559c9f655a21"} Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.515939 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.529676 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:37Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.542711 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:45:37Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.556205 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:37Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.568250 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:37Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.583084 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.583122 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.583132 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.583172 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.583182 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:37Z","lastTransitionTime":"2025-09-29T10:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.590951 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:37Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.603315 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:37Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.613951 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:37Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.628225 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987f
a618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:37Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.645905 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5083afbe3807e485df0ceb9323e330b0f37722f050f83895507559c9f655a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:09Z\\\",\\\"message\\\":\\\"update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-marketplace/marketplace-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 10:45:08.961904 6414 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0929 10:45:08.961922 6414 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0929 10:45:08.961943 6414 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0929 10:45:08.961950 6414 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:45:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:37Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.659767 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://362298e6215cc1a9971973419e58a45e5ded2c4120b1e800afd87f480f6fd3d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695cc
da2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:37Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.674013 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75427e82-74a4-46cd-ac54-210fa4bdd947\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b188b508629875e659215e5d09b261c54073368b770d1f876b5b0146b27f1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c78244f091e746e6cad8937b40c33fd6aef6118e696069f48acc0201635f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c78244f091e746e6cad8937b40c33fd6aef6118e696069f48acc0201635f54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:37Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.685015 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.685057 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.685071 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.685087 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.685099 4752 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:37Z","lastTransitionTime":"2025-09-29T10:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.692022 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:37Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.705061 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:37Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.718363 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36b7c0411c2a5cbcb37f626fa70cfe6c7d3fc6280f6a9e882fa27766f6de761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:29Z\\\",\\\"message\\\":\\\"2025-09-29T10:44:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_46ee9782-bab0-46a6-9758-0bdfa32662ba\\\\n2025-09-29T10:44:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_46ee9782-bab0-46a6-9758-0bdfa32662ba to /host/opt/cni/bin/\\\\n2025-09-29T10:44:44Z [verbose] multus-daemon started\\\\n2025-09-29T10:44:44Z [verbose] Readiness Indicator file check\\\\n2025-09-29T10:45:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:37Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.729874 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5dba49df10714a5f00ec40865af87528f6bee63ee58a89f299af7c10e4d769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://073cf9e4675b04d77ad58f0b7e1b313e3fe15
e8daee4e1c8934a90924b04ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:37Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.739826 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sq7f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a33b92e-d79c-4162-8500-df7a89df8df3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sq7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:37Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:37 crc 
kubenswrapper[4752]: I0929 10:45:37.752939 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:37Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.764752 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:37Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.776829 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22c62d6c-d29c-416f-bfeb-476f97181a39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bab580564f9dd31f6b2ea23a31918a9fdd2f247d13a0bd882f38dbaee4bf0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5132d9611cf58eef747d86fd0cef4eb52366b9d1bacc6df0cf5be145d3998\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6d3ad808fe69e726b66a03be183d33f000a614fadbc7f644015633fbb2b457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:37Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.787624 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.787690 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.787703 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.787717 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.787725 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:37Z","lastTransitionTime":"2025-09-29T10:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.891323 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.891376 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.891388 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.891405 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.891417 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:37Z","lastTransitionTime":"2025-09-29T10:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.993830 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.993871 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.993882 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.993898 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:37 crc kubenswrapper[4752]: I0929 10:45:37.993909 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:37Z","lastTransitionTime":"2025-09-29T10:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.096396 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.096443 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.096454 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.096469 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.096478 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:38Z","lastTransitionTime":"2025-09-29T10:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.199153 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.199207 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.199218 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.199240 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.199254 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:38Z","lastTransitionTime":"2025-09-29T10:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.301868 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.301904 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.301913 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.301926 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.301934 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:38Z","lastTransitionTime":"2025-09-29T10:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.319184 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.319230 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.319241 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.319257 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.319268 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:38Z","lastTransitionTime":"2025-09-29T10:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:38 crc kubenswrapper[4752]: E0929 10:45:38.332531 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.336464 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.336500 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.336510 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.336527 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.336538 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:38Z","lastTransitionTime":"2025-09-29T10:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:38 crc kubenswrapper[4752]: E0929 10:45:38.350089 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.353740 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.353779 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.353790 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.353822 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.353835 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:38Z","lastTransitionTime":"2025-09-29T10:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:38 crc kubenswrapper[4752]: E0929 10:45:38.367080 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.370608 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.370637 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.370648 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.370663 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.370673 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:38Z","lastTransitionTime":"2025-09-29T10:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:38 crc kubenswrapper[4752]: E0929 10:45:38.382606 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.386574 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.386642 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.386653 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.386668 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.386678 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:38Z","lastTransitionTime":"2025-09-29T10:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:38 crc kubenswrapper[4752]: E0929 10:45:38.397960 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: E0929 10:45:38.398152 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.404589 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.404629 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.404642 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.404660 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.404672 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:38Z","lastTransitionTime":"2025-09-29T10:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.507740 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.507783 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.507796 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.507830 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.507841 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:38Z","lastTransitionTime":"2025-09-29T10:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.522748 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovnkube-controller/3.log" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.523897 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovnkube-controller/2.log" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.526630 4752 generic.go:334] "Generic (PLEG): container finished" podID="94028c24-ec10-4d5c-b32c-1700e677d539" containerID="f5083afbe3807e485df0ceb9323e330b0f37722f050f83895507559c9f655a21" exitCode=1 Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.526666 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerDied","Data":"f5083afbe3807e485df0ceb9323e330b0f37722f050f83895507559c9f655a21"} Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.526704 4752 scope.go:117] "RemoveContainer" containerID="d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.527360 4752 scope.go:117] "RemoveContainer" containerID="f5083afbe3807e485df0ceb9323e330b0f37722f050f83895507559c9f655a21" Sep 29 10:45:38 crc kubenswrapper[4752]: E0929 10:45:38.527647 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.544603 4752 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://362298e6215cc1a9971973419e58a45e5ded2c4120b1e800afd87f480f6fd3d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914
bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.560073 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.570338 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.586337 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987f
a618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.610000 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.610047 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.610058 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.610073 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.610084 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:38Z","lastTransitionTime":"2025-09-29T10:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.616320 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5083afbe3807e485df0ceb9323e330b0f37722f050f83895507559c9f655a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7274c1c9e4153ac28534a3b9f58c87c2e5480650edf3522e235805aea87dd76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:09Z\\\",\\\"message\\\":\\\"update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/marketplace-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true 
skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 10:45:08.961904 6414 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0929 10:45:08.961922 6414 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0929 10:45:08.961943 6414 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0929 10:45:08.961950 6414 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:45:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5083afbe3807e485df0ceb9323e330b0f37722f050f83895507559c9f655a21\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:37Z\\\",\\\"message\\\":\\\"\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0929 10:45:37.791790 6772 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication/oauth-openshift]} 
name:Service_openshift-authentication/oauth-openshift_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.222:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c0c2f725-e461-454e-a88c-c8350d62e1ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 10:45:37.791856 6772 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication/oauth-openshift]} name:Service_openshift-authentication/oauth-openshift_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.21\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/e
nv\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc 
kubenswrapper[4752]: I0929 10:45:38.628094 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5dba49df10714a5f00ec40865af87528f6bee63ee58a89f299af7c10e4d769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://073cf9e4675b04d77ad58f0b7e1b313e3fe15e8daee4e1c8934a90924b04ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.639793 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sq7f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a33b92e-d79c-4162-8500-df7a89df8df3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sq7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc 
kubenswrapper[4752]: I0929 10:45:38.653001 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.664921 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75427e82-74a4-46cd-ac54-210fa4bdd947\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b188b508629875e659215e5d09b261c54073368b770d1f876b5b0146b27f1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c78244f091e746e6cad8937b40c33fd6aef6118e696069f48acc0201635f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c78244f091e746e6cad8937b40c33fd6aef6118e696069f48acc0201635f54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.680904 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.693475 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.705905 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36b7c0411c2a5cbcb37f626fa70cfe6c7d3fc6280f6a9e882fa27766f6de761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:29Z\\\",\\\"message\\\":\\\"2025-09-29T10:44:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_46ee9782-bab0-46a6-9758-0bdfa32662ba\\\\n2025-09-29T10:44:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_46ee9782-bab0-46a6-9758-0bdfa32662ba to /host/opt/cni/bin/\\\\n2025-09-29T10:44:44Z [verbose] multus-daemon started\\\\n2025-09-29T10:44:44Z [verbose] Readiness Indicator file check\\\\n2025-09-29T10:45:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.712170 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.712208 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.712216 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.712232 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.712241 4752 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:38Z","lastTransitionTime":"2025-09-29T10:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.717541 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22c62d6c-d29c-416f-bfeb-476f97181a39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bab580564f9dd31f6b2ea23a31918a9fdd2f247d13a0bd882f38dbaee4bf0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5132d9611cf58eef747d86fd0cef4eb52366b9d1bacc6df0cf5be145d3998\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6d3ad808fe69e726b66a03be183d33f000a614fadbc7f644015633fbb2b457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.728494 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.747070 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.760052 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.772897 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.784706 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.797785 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:38Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.814513 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.814858 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.814931 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.815020 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.815084 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:38Z","lastTransitionTime":"2025-09-29T10:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.917617 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.917662 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.917671 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.917688 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:38 crc kubenswrapper[4752]: I0929 10:45:38.917698 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:38Z","lastTransitionTime":"2025-09-29T10:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.019666 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.019702 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.019713 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.019757 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.019772 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:39Z","lastTransitionTime":"2025-09-29T10:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.030400 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.030445 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.030480 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:39 crc kubenswrapper[4752]: E0929 10:45:39.030546 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.030559 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:39 crc kubenswrapper[4752]: E0929 10:45:39.030651 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:39 crc kubenswrapper[4752]: E0929 10:45:39.030787 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:39 crc kubenswrapper[4752]: E0929 10:45:39.030844 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.121888 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.121930 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.121939 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.121952 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.121963 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:39Z","lastTransitionTime":"2025-09-29T10:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.224349 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.224386 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.224399 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.224415 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.224428 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:39Z","lastTransitionTime":"2025-09-29T10:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.327407 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.327466 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.327483 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.327509 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.327526 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:39Z","lastTransitionTime":"2025-09-29T10:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.430367 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.430441 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.430464 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.430491 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.430512 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:39Z","lastTransitionTime":"2025-09-29T10:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.532307 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.532354 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.532372 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.532397 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.532416 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:39Z","lastTransitionTime":"2025-09-29T10:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.533568 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovnkube-controller/3.log" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.537179 4752 scope.go:117] "RemoveContainer" containerID="f5083afbe3807e485df0ceb9323e330b0f37722f050f83895507559c9f655a21" Sep 29 10:45:39 crc kubenswrapper[4752]: E0929 10:45:39.537341 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.555094 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22c62d6c-d29c-416f-bfeb-476f97181a39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bab580564f9dd31f6b2ea23a31918a9fdd2f247d13a0bd882f38dbaee4bf0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5132d9611cf58eef747d86fd0cef4eb52366b9d1bacc6df0cf5be145d3998\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6d3ad808fe69e726b66a03be183d33f000a614fadbc7f644015633fbb2b457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:39Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.573842 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:39Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.593606 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:39Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.609674 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:39Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.623928 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:45:39Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.634895 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.634933 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.634944 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.634959 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.634969 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:39Z","lastTransitionTime":"2025-09-29T10:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.635971 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:39Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.645055 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:39Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.658009 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://362298e6215cc1a9971973419e58a45e5ded2c4120b1e800afd87f480f6fd3d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-
29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:39Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.668683 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:39Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.679678 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:39Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.693848 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987f
a618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:39Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.714568 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5083afbe3807e485df0ceb9323e330b0f37722f050f83895507559c9f655a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5083afbe3807e485df0ceb9323e330b0f37722f050f83895507559c9f655a21\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:37Z\\\",\\\"message\\\":\\\"\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0929 10:45:37.791790 6772 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication/oauth-openshift]} name:Service_openshift-authentication/oauth-openshift_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.222:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c0c2f725-e461-454e-a88c-c8350d62e1ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 10:45:37.791856 6772 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication/oauth-openshift]} name:Service_openshift-authentication/oauth-openshift_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.21\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:45:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf48
69b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:39Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.728224 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:39Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.737219 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.737251 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.737260 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.737274 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.737283 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:39Z","lastTransitionTime":"2025-09-29T10:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.738790 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75427e82-74a4-46cd-ac54-210fa4bdd947\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b188b508629875e659215e5d09b261c54073368b770d1f876b5b0146b27f1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c78244f091e746e6cad8937b40c33fd6aef6118e696069f48acc0201635f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c78244f091e746e6cad8937b40c33fd6aef6118e696069f48acc0201635f54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:39Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.751553 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:39Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.765347 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:39Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.779200 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36b7c0411c2a5cbcb37f626fa70cfe6c7d3fc6280f6a9e882fa27766f6de761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:29Z\\\",\\\"message\\\":\\\"2025-09-29T10:44:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_46ee9782-bab0-46a6-9758-0bdfa32662ba\\\\n2025-09-29T10:44:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_46ee9782-bab0-46a6-9758-0bdfa32662ba to /host/opt/cni/bin/\\\\n2025-09-29T10:44:44Z [verbose] multus-daemon started\\\\n2025-09-29T10:44:44Z [verbose] Readiness Indicator file check\\\\n2025-09-29T10:45:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:39Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.796034 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5dba49df10714a5f00ec40865af87528f6bee63ee58a89f299af7c10e4d769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://073cf9e4675b04d77ad58f0b7e1b313e3fe15
e8daee4e1c8934a90924b04ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:39Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.807724 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sq7f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a33b92e-d79c-4162-8500-df7a89df8df3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sq7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:39Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:39 crc 
kubenswrapper[4752]: I0929 10:45:39.839560 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.839590 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.839598 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.839611 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.839619 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:39Z","lastTransitionTime":"2025-09-29T10:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.942624 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.942679 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.942699 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.942717 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:39 crc kubenswrapper[4752]: I0929 10:45:39.942729 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:39Z","lastTransitionTime":"2025-09-29T10:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.044953 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.045011 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.044922 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:40Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.045027 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.045198 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.045210 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:40Z","lastTransitionTime":"2025-09-29T10:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.063344 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:40Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.078421 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36b7c0411c2a5cbcb37f626fa70cfe6c7d3fc6280f6a9e882fa27766f6de761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30e
e75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:29Z\\\",\\\"message\\\":\\\"2025-09-29T10:44:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_46ee9782-bab0-46a6-9758-0bdfa32662ba\\\\n2025-09-29T10:44:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_46ee9782-bab0-46a6-9758-0bdfa32662ba to /host/opt/cni/bin/\\\\n2025-09-29T10:44:44Z [verbose] multus-daemon started\\\\n2025-09-29T10:44:44Z [verbose] Readiness Indicator file check\\\\n2025-09-29T10:45:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/
kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:40Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.091837 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5dba49df10714a5f00ec40865af87528f6bee63ee58a89f299af7c10e4d769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://073cf9e4675b04d77ad58f0b7e1b313e3fe15
e8daee4e1c8934a90924b04ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:40Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.104150 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sq7f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a33b92e-d79c-4162-8500-df7a89df8df3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sq7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:40Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:40 crc 
kubenswrapper[4752]: I0929 10:45:40.119924 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:40Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.138933 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75427e82-74a4-46cd-ac54-210fa4bdd947\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b188b508629875e659215e5d09b261c54073368b770d1f876b5b0146b27f1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c78244f091e746e6cad8937b40c33fd6aef6118e696069f48acc0201635f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c78244f091e746e6cad8937b40c33fd6aef6118e696069f48acc0201635f54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:40Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.146921 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.147064 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.147088 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.147112 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.147129 4752 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:40Z","lastTransitionTime":"2025-09-29T10:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.154937 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22c62d6c-d29c-416f-bfeb-476f97181a39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bab580564f9dd31f6b2ea23a31918a9fdd2f247d13a0bd882f38dbaee4bf0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5132d9611cf58eef747d86fd0cef4eb52366b9d1bacc6df0cf5be145d3998\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6d3ad808fe69e726b66a03be183d33f000a614fadbc7f644015633fbb2b457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\
\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:40Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.168914 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:40Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.182082 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:45:40Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.197267 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:40Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.211033 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:40Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.234648 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10
:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:40Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.249876 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:40Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.250380 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.250445 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.250466 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.250485 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.250498 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:40Z","lastTransitionTime":"2025-09-29T10:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.264866 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:40Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.280150 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6
bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203
bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-
cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:40Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.298542 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5083afbe3807e485df0ceb9323e330b0f37722f050f83895507559c9f655a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5083afbe3807e485df0ceb9323e330b0f37722f050f83895507559c9f655a21\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:37Z\\\",\\\"message\\\":\\\"\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0929 10:45:37.791790 6772 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication/oauth-openshift]} name:Service_openshift-authentication/oauth-openshift_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.222:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c0c2f725-e461-454e-a88c-c8350d62e1ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 10:45:37.791856 6772 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication/oauth-openshift]} name:Service_openshift-authentication/oauth-openshift_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.21\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:45:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf48
69b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:40Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.311989 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://362298e6215cc1a9971973419e58a45e5ded2c4120b1e800afd87f480f6fd3d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695cc
da2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:40Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.325783 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:40Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.352836 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.352910 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.352926 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.352953 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.352972 4752 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:40Z","lastTransitionTime":"2025-09-29T10:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.455197 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.455241 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.455274 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.455293 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.455303 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:40Z","lastTransitionTime":"2025-09-29T10:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.557871 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.557920 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.557928 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.557943 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.557951 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:40Z","lastTransitionTime":"2025-09-29T10:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.660904 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.660961 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.660981 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.661011 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.661032 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:40Z","lastTransitionTime":"2025-09-29T10:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.763957 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.764012 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.764023 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.764043 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.764056 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:40Z","lastTransitionTime":"2025-09-29T10:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.866696 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.866741 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.866757 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.866777 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.866790 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:40Z","lastTransitionTime":"2025-09-29T10:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.969845 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.969906 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.969925 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.969948 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:40 crc kubenswrapper[4752]: I0929 10:45:40.969965 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:40Z","lastTransitionTime":"2025-09-29T10:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.030599 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.030656 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.030618 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.030603 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:41 crc kubenswrapper[4752]: E0929 10:45:41.030714 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:41 crc kubenswrapper[4752]: E0929 10:45:41.030766 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:41 crc kubenswrapper[4752]: E0929 10:45:41.030828 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:41 crc kubenswrapper[4752]: E0929 10:45:41.030905 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.073003 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.073071 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.073098 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.073126 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.073149 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:41Z","lastTransitionTime":"2025-09-29T10:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.176053 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.176091 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.176099 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.176113 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.176123 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:41Z","lastTransitionTime":"2025-09-29T10:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.278177 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.278222 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.278231 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.278246 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.278257 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:41Z","lastTransitionTime":"2025-09-29T10:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.380770 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.380840 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.380851 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.380868 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.380884 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:41Z","lastTransitionTime":"2025-09-29T10:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.483003 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.483443 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.483475 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.483507 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.483527 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:41Z","lastTransitionTime":"2025-09-29T10:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.586611 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.586664 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.586677 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.586694 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.586706 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:41Z","lastTransitionTime":"2025-09-29T10:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.688988 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.689031 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.689044 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.689062 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.689073 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:41Z","lastTransitionTime":"2025-09-29T10:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.792115 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.792156 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.792168 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.792186 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.792199 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:41Z","lastTransitionTime":"2025-09-29T10:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.894242 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.894285 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.894299 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.894317 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.894329 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:41Z","lastTransitionTime":"2025-09-29T10:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.996371 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.996426 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.996440 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.996455 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:41 crc kubenswrapper[4752]: I0929 10:45:41.996464 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:41Z","lastTransitionTime":"2025-09-29T10:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.099173 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.099263 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.099291 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.099321 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.099342 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:42Z","lastTransitionTime":"2025-09-29T10:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.202571 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.202640 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.202657 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.202677 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.202693 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:42Z","lastTransitionTime":"2025-09-29T10:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.305136 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.305166 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.305174 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.305186 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.305195 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:42Z","lastTransitionTime":"2025-09-29T10:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.407348 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.407404 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.407433 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.407453 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.407467 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:42Z","lastTransitionTime":"2025-09-29T10:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.510779 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.510882 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.510920 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.510960 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.510984 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:42Z","lastTransitionTime":"2025-09-29T10:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.613272 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.613855 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.613964 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.614073 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.614222 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:42Z","lastTransitionTime":"2025-09-29T10:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.716749 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.716894 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.716919 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.716952 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.716975 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:42Z","lastTransitionTime":"2025-09-29T10:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.819960 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.820051 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.820065 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.820293 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.820306 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:42Z","lastTransitionTime":"2025-09-29T10:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.922706 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.922745 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.922753 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.922769 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:42 crc kubenswrapper[4752]: I0929 10:45:42.922780 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:42Z","lastTransitionTime":"2025-09-29T10:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.025442 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.025488 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.025500 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.025519 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.025536 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:43Z","lastTransitionTime":"2025-09-29T10:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.030006 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.030058 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:43 crc kubenswrapper[4752]: E0929 10:45:43.030119 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:43 crc kubenswrapper[4752]: E0929 10:45:43.030190 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.030236 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:43 crc kubenswrapper[4752]: E0929 10:45:43.030278 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.030368 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:43 crc kubenswrapper[4752]: E0929 10:45:43.030535 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.130306 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.130357 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.130368 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.130382 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.130398 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:43Z","lastTransitionTime":"2025-09-29T10:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.233596 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.233641 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.233653 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.233671 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.233682 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:43Z","lastTransitionTime":"2025-09-29T10:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.336388 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.336419 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.336426 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.336438 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.336446 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:43Z","lastTransitionTime":"2025-09-29T10:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.439419 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.439508 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.439540 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.439570 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.439595 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:43Z","lastTransitionTime":"2025-09-29T10:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.542109 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.542156 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.542168 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.542182 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.542191 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:43Z","lastTransitionTime":"2025-09-29T10:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.644371 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.644410 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.644427 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.644446 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.644457 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:43Z","lastTransitionTime":"2025-09-29T10:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.747139 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.747200 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.747218 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.747239 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.747255 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:43Z","lastTransitionTime":"2025-09-29T10:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.849550 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.849592 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.849604 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.849621 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.849634 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:43Z","lastTransitionTime":"2025-09-29T10:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.952398 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.952434 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.952443 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.952456 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:43 crc kubenswrapper[4752]: I0929 10:45:43.952465 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:43Z","lastTransitionTime":"2025-09-29T10:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.055385 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.055466 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.055484 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.055512 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.055532 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:44Z","lastTransitionTime":"2025-09-29T10:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.158188 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.158234 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.158246 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.158263 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.158277 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:44Z","lastTransitionTime":"2025-09-29T10:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.261147 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.261195 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.261217 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.261236 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.261250 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:44Z","lastTransitionTime":"2025-09-29T10:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.364295 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.364338 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.364347 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.364361 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.364370 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:44Z","lastTransitionTime":"2025-09-29T10:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.466553 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.466595 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.466606 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.466621 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.466632 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:44Z","lastTransitionTime":"2025-09-29T10:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.568763 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.568816 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.568827 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.568842 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.568853 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:44Z","lastTransitionTime":"2025-09-29T10:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.672002 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.672073 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.672092 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.672117 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.672137 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:44Z","lastTransitionTime":"2025-09-29T10:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.774113 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.774159 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.774168 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.774181 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.774190 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:44Z","lastTransitionTime":"2025-09-29T10:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.820643 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:45:44 crc kubenswrapper[4752]: E0929 10:45:44.820971 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-29 10:46:48.820936556 +0000 UTC m=+149.610078263 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.877042 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.877103 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.877116 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.877135 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.877146 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:44Z","lastTransitionTime":"2025-09-29T10:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.921780 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.921890 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.921920 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.921951 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:44 crc kubenswrapper[4752]: E0929 10:45:44.921922 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 10:45:44 crc kubenswrapper[4752]: E0929 10:45:44.922042 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 10:45:44 crc kubenswrapper[4752]: E0929 10:45:44.922063 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 10:45:44 crc kubenswrapper[4752]: E0929 10:45:44.921990 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 10:45:44 crc kubenswrapper[4752]: E0929 10:45:44.922135 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 10:45:44 crc kubenswrapper[4752]: E0929 10:45:44.922164 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:45:44 crc kubenswrapper[4752]: E0929 10:45:44.922052 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 10:46:48.922036908 +0000 UTC m=+149.711178575 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 10:45:44 crc kubenswrapper[4752]: E0929 10:45:44.922080 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 10:45:44 crc kubenswrapper[4752]: E0929 10:45:44.922242 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:45:44 crc kubenswrapper[4752]: E0929 10:45:44.922244 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 10:46:48.922224472 +0000 UTC m=+149.711366140 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 10:45:44 crc kubenswrapper[4752]: E0929 10:45:44.922276 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-09-29 10:46:48.922268004 +0000 UTC m=+149.711409761 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:45:44 crc kubenswrapper[4752]: E0929 10:45:44.922291 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 10:46:48.922284534 +0000 UTC m=+149.711426201 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.980046 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.980082 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.980091 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.980105 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Sep 29 10:45:44 crc kubenswrapper[4752]: I0929 10:45:44.980114 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:44Z","lastTransitionTime":"2025-09-29T10:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.030976 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.031028 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.031056 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.030995 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:45 crc kubenswrapper[4752]: E0929 10:45:45.031153 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:45 crc kubenswrapper[4752]: E0929 10:45:45.031232 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:45 crc kubenswrapper[4752]: E0929 10:45:45.031300 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:45 crc kubenswrapper[4752]: E0929 10:45:45.031358 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.082174 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.082213 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.082229 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.082244 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.082256 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:45Z","lastTransitionTime":"2025-09-29T10:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.184419 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.184464 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.184479 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.184494 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.184506 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:45Z","lastTransitionTime":"2025-09-29T10:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.286670 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.286712 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.286722 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.286735 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.286747 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:45Z","lastTransitionTime":"2025-09-29T10:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.389555 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.389620 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.389644 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.389675 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.389697 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:45Z","lastTransitionTime":"2025-09-29T10:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.492945 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.493005 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.493026 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.493054 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.493076 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:45Z","lastTransitionTime":"2025-09-29T10:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.596438 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.596480 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.596494 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.596516 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.596530 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:45Z","lastTransitionTime":"2025-09-29T10:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.700036 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.700077 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.700089 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.700108 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.700122 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:45Z","lastTransitionTime":"2025-09-29T10:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.803841 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.804161 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.804175 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.804194 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.804213 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:45Z","lastTransitionTime":"2025-09-29T10:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.906850 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.906914 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.906939 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.906963 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:45 crc kubenswrapper[4752]: I0929 10:45:45.906975 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:45Z","lastTransitionTime":"2025-09-29T10:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.009626 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.009685 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.009708 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.009736 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.009751 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:46Z","lastTransitionTime":"2025-09-29T10:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.112273 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.112311 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.112320 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.112335 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.112343 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:46Z","lastTransitionTime":"2025-09-29T10:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.215035 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.215071 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.215078 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.215090 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.215099 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:46Z","lastTransitionTime":"2025-09-29T10:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.318253 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.318303 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.318314 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.318331 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.318346 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:46Z","lastTransitionTime":"2025-09-29T10:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.420543 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.420590 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.420604 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.420627 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.420644 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:46Z","lastTransitionTime":"2025-09-29T10:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.523454 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.523487 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.523510 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.523527 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.523536 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:46Z","lastTransitionTime":"2025-09-29T10:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.626328 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.626393 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.626410 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.626435 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.626451 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:46Z","lastTransitionTime":"2025-09-29T10:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.728975 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.729052 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.729065 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.729081 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.729092 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:46Z","lastTransitionTime":"2025-09-29T10:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.831604 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.831663 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.831674 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.831691 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.831702 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:46Z","lastTransitionTime":"2025-09-29T10:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.934412 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.934483 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.934492 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.934506 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:46 crc kubenswrapper[4752]: I0929 10:45:46.934516 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:46Z","lastTransitionTime":"2025-09-29T10:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.030474 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.030498 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.030552 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:47 crc kubenswrapper[4752]: E0929 10:45:47.030624 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.030474 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:47 crc kubenswrapper[4752]: E0929 10:45:47.030768 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:47 crc kubenswrapper[4752]: E0929 10:45:47.030875 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:47 crc kubenswrapper[4752]: E0929 10:45:47.030963 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.036383 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.036412 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.036422 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.036435 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.036447 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:47Z","lastTransitionTime":"2025-09-29T10:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.140260 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.140304 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.140317 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.140333 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.140342 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:47Z","lastTransitionTime":"2025-09-29T10:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.243243 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.243276 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.243286 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.243300 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.243310 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:47Z","lastTransitionTime":"2025-09-29T10:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.345982 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.346038 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.346051 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.346069 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.346079 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:47Z","lastTransitionTime":"2025-09-29T10:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.449717 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.449777 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.449792 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.449846 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.449865 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:47Z","lastTransitionTime":"2025-09-29T10:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.553699 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.553760 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.553773 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.553817 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.553832 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:47Z","lastTransitionTime":"2025-09-29T10:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.656666 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.656862 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.656885 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.656902 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.656917 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:47Z","lastTransitionTime":"2025-09-29T10:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.758879 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.758931 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.758944 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.758965 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.758977 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:47Z","lastTransitionTime":"2025-09-29T10:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.861486 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.861526 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.861536 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.861551 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.861562 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:47Z","lastTransitionTime":"2025-09-29T10:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.964264 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.964318 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.964332 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.964350 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:47 crc kubenswrapper[4752]: I0929 10:45:47.964362 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:47Z","lastTransitionTime":"2025-09-29T10:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.067337 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.067394 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.067406 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.067430 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.067445 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:48Z","lastTransitionTime":"2025-09-29T10:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.170297 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.170344 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.170356 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.170372 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.170384 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:48Z","lastTransitionTime":"2025-09-29T10:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.273354 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.273414 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.273438 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.273460 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.273475 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:48Z","lastTransitionTime":"2025-09-29T10:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.376005 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.376043 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.376059 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.376075 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.376085 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:48Z","lastTransitionTime":"2025-09-29T10:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.479070 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.479432 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.479576 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.479673 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.479762 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:48Z","lastTransitionTime":"2025-09-29T10:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.540900 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.540958 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.540972 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.540994 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.541010 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:48Z","lastTransitionTime":"2025-09-29T10:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:48 crc kubenswrapper[4752]: E0929 10:45:48.560171 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:48Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.566189 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.566240 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.566255 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.566277 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.566293 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:48Z","lastTransitionTime":"2025-09-29T10:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:48 crc kubenswrapper[4752]: E0929 10:45:48.582094 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:48Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.586327 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.586458 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.586533 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.586595 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.586648 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:48Z","lastTransitionTime":"2025-09-29T10:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:48 crc kubenswrapper[4752]: E0929 10:45:48.598985 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:48Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.602651 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.602921 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.602999 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.603094 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.603179 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:48Z","lastTransitionTime":"2025-09-29T10:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:48 crc kubenswrapper[4752]: E0929 10:45:48.619396 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:48Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.622921 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.622960 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.622969 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.622983 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.622993 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:48Z","lastTransitionTime":"2025-09-29T10:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:48 crc kubenswrapper[4752]: E0929 10:45:48.636628 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67757396-6dfe-4e60-ba89-bdfd50031eb3\\\",\\\"systemUUID\\\":\\\"d8106fc8-56a6-4aa2-998a-aa38bb8caa68\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:48Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:48 crc kubenswrapper[4752]: E0929 10:45:48.637061 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.638458 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.638480 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.638489 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.638503 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.638512 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:48Z","lastTransitionTime":"2025-09-29T10:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.740995 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.741045 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.741056 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.741075 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.741088 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:48Z","lastTransitionTime":"2025-09-29T10:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.843652 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.843839 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.843856 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.843874 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.843885 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:48Z","lastTransitionTime":"2025-09-29T10:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.946931 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.946991 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.946999 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.947014 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:48 crc kubenswrapper[4752]: I0929 10:45:48.947022 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:48Z","lastTransitionTime":"2025-09-29T10:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.030259 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.030312 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:49 crc kubenswrapper[4752]: E0929 10:45:49.030406 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:49 crc kubenswrapper[4752]: E0929 10:45:49.030494 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.030585 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.030617 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:49 crc kubenswrapper[4752]: E0929 10:45:49.030688 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:49 crc kubenswrapper[4752]: E0929 10:45:49.030775 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.049540 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.049569 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.049579 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.049596 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.049608 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:49Z","lastTransitionTime":"2025-09-29T10:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.151671 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.151723 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.151739 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.151753 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.151762 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:49Z","lastTransitionTime":"2025-09-29T10:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.254140 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.254194 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.254206 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.254226 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.254239 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:49Z","lastTransitionTime":"2025-09-29T10:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.357032 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.357120 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.357134 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.357157 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.357170 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:49Z","lastTransitionTime":"2025-09-29T10:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.459381 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.459418 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.459426 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.459440 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.459450 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:49Z","lastTransitionTime":"2025-09-29T10:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.561717 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.561780 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.561788 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.561840 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.561851 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:49Z","lastTransitionTime":"2025-09-29T10:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.664058 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.664104 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.664114 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.664130 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.664141 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:49Z","lastTransitionTime":"2025-09-29T10:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.767570 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.767639 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.767653 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.767669 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.767679 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:49Z","lastTransitionTime":"2025-09-29T10:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.869956 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.870021 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.870045 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.870073 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.870096 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:49Z","lastTransitionTime":"2025-09-29T10:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.972945 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.973005 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.973027 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.973049 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:49 crc kubenswrapper[4752]: I0929 10:45:49.973063 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:49Z","lastTransitionTime":"2025-09-29T10:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.046500 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22c62d6c-d29c-416f-bfeb-476f97181a39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bab580564f9dd31f6b2ea23a31918a9fdd2f247d13a0bd882f38dbaee4bf0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5132d9611cf58eef747d86fd0ce
f4eb52366b9d1bacc6df0cf5be145d3998\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6d3ad808fe69e726b66a03be183d33f000a614fadbc7f644015633fbb2b457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5335487c039d2e7e80a940cfe980fb46caf0cfc6302660b9318d9c8c525227cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.064664 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.076841 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.077242 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.077360 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:50 crc 
kubenswrapper[4752]: I0929 10:45:50.077476 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.077598 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:50Z","lastTransitionTime":"2025-09-29T10:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.082697 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5863c243-797d-462a-b11f-71aaf005f8d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://166738b29f01996ec981fd00b49f422e4a97fe774396e7ea153ad29ef30a7370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdtpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-mgrvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.096508 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4whp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398b6e5c-29ac-4701-9207-d3d269b62224\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63db080ebca3f5ea23ddc9af874b6b500abe8044c73794ae0749df2949fb9520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9hp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4whp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.122861 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48ad7053-6039-4b1a-9729-fcbe1d938928\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00965359c30aa25677d4b114c00b339b155ab4b5316d5e355536bea5b65eaba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e2d86e0821e0155affe296e5cc70e9904f04c800943101e62509e3a5e4e0808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378a6f1ac902b030f4ecabac1eae40f884dc1546a360e178f38300e137d8b0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a174bcfad22c2a58c48792478272705c80a56775b45b14919ea1de1dd92b4cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://828d416b69696f709d91feb8df8fead0f95be74a91c5dab25756e341e29413dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ae4f6e0a6df2f1e370b0ff37704c0b0252752c0d8e8a1cdd83088ca9ec951\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c90938f79ba960fa16979dd5f239674df4b13cae8b0b5d3bb48b0e46219a34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99c6fe84624f3e518bbe35ee9b700effb126ff1f36d995262b7ed8b73364780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.144401 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.160498 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131d2c8a72fc6a373ebf6835840e6b9c1829db4c78b4961bf36642fd0e8a5636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T10:45:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.178277 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f30a1f9-86ef-450e-9f8c-8ef8d4ac380a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bc5aff417397c8b264553f67de7ebd1aeadb67fb83114c5bb13c2e0d10e397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239ca1f17b9f1e1d6ba63b196e34066fe7fb37373453460261044f5fcaf819af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5b369dc688f11e4ab502a3886b722cba392fce0d3ac7850bd59abffbf7dee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d17821abed9aca5c20373738f44ca9a61e954d1eee46f0d16c3e9b34d810a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f5727e0bd53639ba6b6632f2d62c7c62ae74b07a60aa1cb58c2020990cae42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd84740e3b0a970decedcc3960fb987fa618f9627f06be1d2d0b034d0361f805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6af6d9f7c1ca6625f88dcaa9ef267cf11f3ebb16a0ce12d3c2442550bc0833ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vm6zb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.179630 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.179660 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.179668 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.179682 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.179692 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:50Z","lastTransitionTime":"2025-09-29T10:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.208246 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94028c24-ec10-4d5c-b32c-1700e677d539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5083afbe3807e485df0ceb9323e330b0f37722f050f83895507559c9f655a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5083afbe3807e485df0ceb9323e330b0f37722f050f83895507559c9f655a21\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:37Z\\\",\\\"message\\\":\\\"\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0929 10:45:37.791790 6772 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication/oauth-openshift]} name:Service_openshift-authentication/oauth-openshift_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.222:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c0c2f725-e461-454e-a88c-c8350d62e1ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 10:45:37.791856 6772 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication/oauth-openshift]} name:Service_openshift-authentication/oauth-openshift_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.21\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:45:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f22dfbbd26fb3ebf48
69b46406913cc1963e33c11794193c815235be5acee338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9v6qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2vrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.235102 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520a5d33-312c-4033-8b69-5dd582f13ccc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6223734bbce461c09916aea7629bba0cfa97ea17050bca7417020ece9ae031a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1157b82d6f3337270d30abdceadaa1f0a01b3c6d8de6bc8e9edf083a8264f19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854abd6205c2eec2229d0d65aec3edb7cf1cc1e77759df41bd22deda4a08c8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://362298e6215cc1a9971973419e58a45e5ded2c4120b1e800afd87f480f6fd3d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c927118840179fccacbe6a18a329c117cef73a6e914bf38d20fc2439d6a5c1ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T10:44:41Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0929 10:44:40.787758 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0929 10:44:40.787900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 10:44:40.788558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1487283959/tls.crt::/tmp/serving-cert-1487283959/tls.key\\\\\\\"\\\\nI0929 10:44:41.256284 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 10:44:41.261265 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 10:44:41.261291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 10:44:41.261311 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 10:44:41.261316 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 10:44:41.267824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0929 10:44:41.267847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 10:44:41.267849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267871 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 10:44:41.267876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 10:44:41.267879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 10:44:41.267882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 10:44:41.267884 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 10:44:41.270258 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe61bb570ef2be352bb3a0e55da353ce7b618b397e3bf9f0d66da0c9b6f1d4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695ccda2e78efb1c6fd635f1535467cc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80f961b58569cce6d634f225369902695cc
da2e78efb1c6fd635f1535467cc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.253900 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4f637cfcb1e52fa69f0ffa46b3a53459225d9ad4afd1178bff709e812c5418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b70242846937de5b4dda37a2b8c48947fded378c299ea4ad857168589d7c175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.265095 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7kp7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a61a7f-9be6-486b-a425-62ed62ec0ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4170732970e5e7c429279d239eb2d4b9d8249ff254b35f38ff80d0321087be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kgr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7kp7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.280304 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb781fd16d4a9f56202eb1724ed1a4ed6700ff7b81819573b955bcb07e563a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.281995 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.282025 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.282037 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.282053 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.282065 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:50Z","lastTransitionTime":"2025-09-29T10:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.296375 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xv5q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52fc9378-c37b-424b-afde-7b191bab5fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36b7c0411c2a5cbcb37f626fa70cfe6c7d3fc6280f6a9e882fa27766f6de761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T10:45:29Z\\\",\\\"message\\\":\\\"2025-09-29T10:44:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_46ee9782-bab0-46a6-9758-0bdfa32662ba\\\\n2025-09-29T10:44:43+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_46ee9782-bab0-46a6-9758-0bdfa32662ba to /host/opt/cni/bin/\\\\n2025-09-29T10:44:44Z [verbose] multus-daemon started\\\\n2025-09-29T10:44:44Z [verbose] Readiness Indicator file check\\\\n2025-09-29T10:45:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4rqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xv5q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.307577 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f5485e-9000-4512-aad3-7d367715ac2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5dba49df10714a5f00ec40865af87528f6bee63ee58a89f299af7c10e4d769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://073cf9e4675b04d77ad58f0b7e1b313e3fe15
e8daee4e1c8934a90924b04ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z772z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mp5pm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.320434 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sq7f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a33b92e-d79c-4162-8500-df7a89df8df3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qck2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sq7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:50 crc 
kubenswrapper[4752]: I0929 10:45:50.333693 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e5d3a3-2f2d-4f61-ae95-26ebd1f72342\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66d77cd5048e199a6eae84be4079c3b00305f4f5223b5176a49df0feb2f0bf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b270e951a827068c908168bf04d4cd3bcba62e472e4a3f415de8b7463fdccc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd4d83f6d6b5db7fc93239bc1a6b731c67bc15ef1ca1990b53589e4ad36bfa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39ef26bf3e7b95ac9a59199bbabe11fd4e831baba1b120ef97a4839c0c4aab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.345557 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75427e82-74a4-46cd-ac54-210fa4bdd947\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b188b508629875e659215e5d09b261c54073368b770d1f876b5b0146b27f1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T10:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c78244f091e746e6cad8937b40c33fd6aef6118e696069f48acc0201635f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c78244f091e746e6cad8937b40c33fd6aef6118e696069f48acc0201635f54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T10:44:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T10:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T10:44:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.357211 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T10:44:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T10:45:50Z is after 2025-08-24T17:21:41Z" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.384674 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 
10:45:50.384716 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.384727 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.384742 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.384751 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:50Z","lastTransitionTime":"2025-09-29T10:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.487059 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.487099 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.487107 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.487123 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.487133 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:50Z","lastTransitionTime":"2025-09-29T10:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.589401 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.589431 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.589439 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.589452 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.589461 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:50Z","lastTransitionTime":"2025-09-29T10:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.692901 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.693170 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.693272 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.693415 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.693520 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:50Z","lastTransitionTime":"2025-09-29T10:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.796281 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.796338 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.796355 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.796375 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.796392 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:50Z","lastTransitionTime":"2025-09-29T10:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.899467 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.899542 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.899561 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.899584 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:50 crc kubenswrapper[4752]: I0929 10:45:50.899602 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:50Z","lastTransitionTime":"2025-09-29T10:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.002833 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.002886 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.002900 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.002919 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.002934 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:51Z","lastTransitionTime":"2025-09-29T10:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.030057 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:51 crc kubenswrapper[4752]: E0929 10:45:51.030201 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.030993 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.031017 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:51 crc kubenswrapper[4752]: E0929 10:45:51.031076 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:51 crc kubenswrapper[4752]: E0929 10:45:51.031162 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.031022 4752 scope.go:117] "RemoveContainer" containerID="f5083afbe3807e485df0ceb9323e330b0f37722f050f83895507559c9f655a21" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.031276 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:51 crc kubenswrapper[4752]: E0929 10:45:51.031471 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" Sep 29 10:45:51 crc kubenswrapper[4752]: E0929 10:45:51.031517 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.106001 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.106064 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.106077 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.106099 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.106114 4752 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:51Z","lastTransitionTime":"2025-09-29T10:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.208343 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.208379 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.208390 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.208404 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.208415 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:51Z","lastTransitionTime":"2025-09-29T10:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.310629 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.310697 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.310712 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.310733 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.310746 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:51Z","lastTransitionTime":"2025-09-29T10:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.412592 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.412641 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.412649 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.412665 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.412675 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:51Z","lastTransitionTime":"2025-09-29T10:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.515566 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.515607 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.515618 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.515634 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.515644 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:51Z","lastTransitionTime":"2025-09-29T10:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.618268 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.618325 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.618336 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.618358 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.618371 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:51Z","lastTransitionTime":"2025-09-29T10:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.720918 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.720960 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.720971 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.720987 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.720995 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:51Z","lastTransitionTime":"2025-09-29T10:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.823489 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.823540 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.823552 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.823570 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.823584 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:51Z","lastTransitionTime":"2025-09-29T10:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.925670 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.925700 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.925708 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.925721 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:51 crc kubenswrapper[4752]: I0929 10:45:51.925730 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:51Z","lastTransitionTime":"2025-09-29T10:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.029385 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.029496 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.029517 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.029544 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.029564 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:52Z","lastTransitionTime":"2025-09-29T10:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.132238 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.132317 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.132336 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.132354 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.132366 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:52Z","lastTransitionTime":"2025-09-29T10:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.235444 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.235486 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.235494 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.235508 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.235517 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:52Z","lastTransitionTime":"2025-09-29T10:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.338032 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.338067 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.338075 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.338089 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.338098 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:52Z","lastTransitionTime":"2025-09-29T10:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.440960 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.441031 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.441047 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.441066 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.441077 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:52Z","lastTransitionTime":"2025-09-29T10:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.543469 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.543510 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.543522 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.543540 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.543551 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:52Z","lastTransitionTime":"2025-09-29T10:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.646746 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.646851 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.646874 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.646905 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.646926 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:52Z","lastTransitionTime":"2025-09-29T10:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.750162 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.750264 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.750276 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.750305 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.750324 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:52Z","lastTransitionTime":"2025-09-29T10:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.853257 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.853309 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.853321 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.853345 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.853361 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:52Z","lastTransitionTime":"2025-09-29T10:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.956489 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.956539 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.956551 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.956570 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:52 crc kubenswrapper[4752]: I0929 10:45:52.956581 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:52Z","lastTransitionTime":"2025-09-29T10:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.030938 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.031001 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.030960 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.030960 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:53 crc kubenswrapper[4752]: E0929 10:45:53.031080 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:53 crc kubenswrapper[4752]: E0929 10:45:53.031309 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:53 crc kubenswrapper[4752]: E0929 10:45:53.031337 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:53 crc kubenswrapper[4752]: E0929 10:45:53.031405 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.059756 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.059795 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.059834 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.059850 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.059858 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:53Z","lastTransitionTime":"2025-09-29T10:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.162314 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.162560 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.162571 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.162586 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.162597 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:53Z","lastTransitionTime":"2025-09-29T10:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.265597 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.265653 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.265661 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.265678 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.265687 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:53Z","lastTransitionTime":"2025-09-29T10:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.367973 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.368059 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.368083 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.368111 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.368128 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:53Z","lastTransitionTime":"2025-09-29T10:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.470953 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.471027 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.471042 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.471066 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.471087 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:53Z","lastTransitionTime":"2025-09-29T10:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.574521 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.574587 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.574609 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.574640 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.574663 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:53Z","lastTransitionTime":"2025-09-29T10:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.678169 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.678212 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.678221 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.678244 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.678256 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:53Z","lastTransitionTime":"2025-09-29T10:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.781165 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.781246 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.781270 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.781303 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.781342 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:53Z","lastTransitionTime":"2025-09-29T10:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.884759 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.884858 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.884877 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.884899 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.884916 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:53Z","lastTransitionTime":"2025-09-29T10:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.987682 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.987739 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.987754 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.987775 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:53 crc kubenswrapper[4752]: I0929 10:45:53.987792 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:53Z","lastTransitionTime":"2025-09-29T10:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.091023 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.091087 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.091098 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.091131 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.091143 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:54Z","lastTransitionTime":"2025-09-29T10:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.193681 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.193720 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.193738 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.193755 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.193764 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:54Z","lastTransitionTime":"2025-09-29T10:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.296702 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.296745 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.296756 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.296854 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.296870 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:54Z","lastTransitionTime":"2025-09-29T10:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.399148 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.399215 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.399232 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.399256 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.399273 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:54Z","lastTransitionTime":"2025-09-29T10:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.502517 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.502581 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.502591 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.502609 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.502620 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:54Z","lastTransitionTime":"2025-09-29T10:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.606177 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.606242 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.606271 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.606302 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.606325 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:54Z","lastTransitionTime":"2025-09-29T10:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.709756 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.709826 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.709842 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.709865 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.709881 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:54Z","lastTransitionTime":"2025-09-29T10:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.812332 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.812375 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.812384 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.812399 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.812408 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:54Z","lastTransitionTime":"2025-09-29T10:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.915077 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.915120 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.915130 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.915147 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:54 crc kubenswrapper[4752]: I0929 10:45:54.915158 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:54Z","lastTransitionTime":"2025-09-29T10:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.017322 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.017394 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.017417 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.017444 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.017465 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:55Z","lastTransitionTime":"2025-09-29T10:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.030418 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.030488 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.030530 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.030418 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 10:45:55 crc kubenswrapper[4752]: E0929 10:45:55.030652 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3"
Sep 29 10:45:55 crc kubenswrapper[4752]: E0929 10:45:55.030795 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 29 10:45:55 crc kubenswrapper[4752]: E0929 10:45:55.030938 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 29 10:45:55 crc kubenswrapper[4752]: E0929 10:45:55.031083 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.120182 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.120243 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.120260 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.120285 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.120302 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:55Z","lastTransitionTime":"2025-09-29T10:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.222945 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.223018 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.223031 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.223052 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.223066 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:55Z","lastTransitionTime":"2025-09-29T10:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.325421 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.325468 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.325477 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.325493 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.325504 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:55Z","lastTransitionTime":"2025-09-29T10:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.427818 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.427858 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.427872 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.427889 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.427900 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:55Z","lastTransitionTime":"2025-09-29T10:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.529972 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.530010 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.530024 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.530040 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.530049 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:55Z","lastTransitionTime":"2025-09-29T10:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.632553 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.632585 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.632631 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.632649 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.632659 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:55Z","lastTransitionTime":"2025-09-29T10:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.735186 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.735244 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.735255 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.735271 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.735280 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:55Z","lastTransitionTime":"2025-09-29T10:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.837797 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.837886 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.837904 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.837926 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.837955 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:55Z","lastTransitionTime":"2025-09-29T10:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.941065 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.941120 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.941130 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.941147 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:55 crc kubenswrapper[4752]: I0929 10:45:55.941159 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:55Z","lastTransitionTime":"2025-09-29T10:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.044115 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.044187 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.044202 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.044221 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.044256 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:56Z","lastTransitionTime":"2025-09-29T10:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.146903 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.146946 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.146957 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.146972 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.146983 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:56Z","lastTransitionTime":"2025-09-29T10:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.249569 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.249640 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.249677 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.249706 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.249727 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:56Z","lastTransitionTime":"2025-09-29T10:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.353035 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.353082 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.353091 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.353106 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.353117 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:56Z","lastTransitionTime":"2025-09-29T10:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.456182 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.456234 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.456250 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.456277 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.456295 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:56Z","lastTransitionTime":"2025-09-29T10:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.559471 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.559544 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.559570 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.559599 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.559625 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:56Z","lastTransitionTime":"2025-09-29T10:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.662892 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.662940 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.662954 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.662975 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.662988 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:56Z","lastTransitionTime":"2025-09-29T10:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.766302 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.766349 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.766362 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.766386 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.766403 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:56Z","lastTransitionTime":"2025-09-29T10:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.868616 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.868677 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.868690 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.868712 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.868728 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:56Z","lastTransitionTime":"2025-09-29T10:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.971551 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.971597 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.971612 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.971633 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:56 crc kubenswrapper[4752]: I0929 10:45:56.971645 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:56Z","lastTransitionTime":"2025-09-29T10:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.030290 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.030419 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.030496 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.030537 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:57 crc kubenswrapper[4752]: E0929 10:45:57.031855 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:57 crc kubenswrapper[4752]: E0929 10:45:57.031994 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:57 crc kubenswrapper[4752]: E0929 10:45:57.032138 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:57 crc kubenswrapper[4752]: E0929 10:45:57.032234 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.074901 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.074970 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.074985 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.075007 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.075024 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:57Z","lastTransitionTime":"2025-09-29T10:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.177449 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.177514 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.177530 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.177552 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.177565 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:57Z","lastTransitionTime":"2025-09-29T10:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.280628 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.280667 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.280681 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.280699 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.280709 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:57Z","lastTransitionTime":"2025-09-29T10:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.383430 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.383705 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.383859 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.383938 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.384007 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:57Z","lastTransitionTime":"2025-09-29T10:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.487326 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.487356 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.487364 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.487381 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.487389 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:57Z","lastTransitionTime":"2025-09-29T10:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.589971 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.590016 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.590030 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.590047 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.590058 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:57Z","lastTransitionTime":"2025-09-29T10:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.694137 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.694193 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.694202 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.694232 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.694244 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:57Z","lastTransitionTime":"2025-09-29T10:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.796989 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.797367 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.797486 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.797578 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.797639 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:57Z","lastTransitionTime":"2025-09-29T10:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.900898 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.900946 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.900957 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.900974 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:57 crc kubenswrapper[4752]: I0929 10:45:57.900986 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:57Z","lastTransitionTime":"2025-09-29T10:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.003404 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.003436 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.003445 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.003460 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.003469 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:58Z","lastTransitionTime":"2025-09-29T10:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.106321 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.106367 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.106376 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.106399 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.106413 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:58Z","lastTransitionTime":"2025-09-29T10:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.209267 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.209321 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.209333 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.209354 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.209366 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:58Z","lastTransitionTime":"2025-09-29T10:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.312468 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.312534 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.312548 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.312571 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.312586 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:58Z","lastTransitionTime":"2025-09-29T10:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.416085 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.416159 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.416183 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.416216 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.416234 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:58Z","lastTransitionTime":"2025-09-29T10:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.519834 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.519923 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.519948 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.519983 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.520007 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:58Z","lastTransitionTime":"2025-09-29T10:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.622408 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.622442 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.622452 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.622466 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.622476 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:58Z","lastTransitionTime":"2025-09-29T10:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.725317 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.725410 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.725430 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.725465 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.725496 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:58Z","lastTransitionTime":"2025-09-29T10:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.828293 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.828329 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.828338 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.828351 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.828362 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:58Z","lastTransitionTime":"2025-09-29T10:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.899623 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.899654 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.899665 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.899679 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.899688 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T10:45:58Z","lastTransitionTime":"2025-09-29T10:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.950514 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns"] Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.951105 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.953049 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.953279 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.953445 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.953964 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.972403 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6537b356-dc0d-424d-a085-e3661c2b26eb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tr8ns\" (UID: \"6537b356-dc0d-424d-a085-e3661c2b26eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.972503 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6537b356-dc0d-424d-a085-e3661c2b26eb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tr8ns\" (UID: \"6537b356-dc0d-424d-a085-e3661c2b26eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.972603 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/6537b356-dc0d-424d-a085-e3661c2b26eb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tr8ns\" (UID: \"6537b356-dc0d-424d-a085-e3661c2b26eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.972672 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6537b356-dc0d-424d-a085-e3661c2b26eb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tr8ns\" (UID: \"6537b356-dc0d-424d-a085-e3661c2b26eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.972723 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6537b356-dc0d-424d-a085-e3661c2b26eb-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tr8ns\" (UID: \"6537b356-dc0d-424d-a085-e3661c2b26eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns" Sep 29 10:45:58 crc kubenswrapper[4752]: I0929 10:45:58.980639 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podStartSLOduration=78.980608139 podStartE2EDuration="1m18.980608139s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:45:58.968773527 +0000 UTC m=+99.757915194" watchObservedRunningTime="2025-09-29 10:45:58.980608139 +0000 UTC m=+99.769749806" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.003377 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=77.00336042 podStartE2EDuration="1m17.00336042s" 
podCreationTimestamp="2025-09-29 10:44:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:45:59.002675113 +0000 UTC m=+99.791816780" watchObservedRunningTime="2025-09-29 10:45:59.00336042 +0000 UTC m=+99.792502087" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.003604 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4whp8" podStartSLOduration=79.003601267 podStartE2EDuration="1m19.003601267s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:45:58.981075801 +0000 UTC m=+99.770217468" watchObservedRunningTime="2025-09-29 10:45:59.003601267 +0000 UTC m=+99.792742934" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.030916 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.030964 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:59 crc kubenswrapper[4752]: E0929 10:45:59.031047 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.031052 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:45:59 crc kubenswrapper[4752]: E0929 10:45:59.031248 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:45:59 crc kubenswrapper[4752]: E0929 10:45:59.031352 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.031493 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:45:59 crc kubenswrapper[4752]: E0929 10:45:59.031606 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.052098 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vm6zb" podStartSLOduration=79.052062077 podStartE2EDuration="1m19.052062077s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:45:59.051354639 +0000 UTC m=+99.840496306" watchObservedRunningTime="2025-09-29 10:45:59.052062077 +0000 UTC m=+99.841203734" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.074100 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6537b356-dc0d-424d-a085-e3661c2b26eb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tr8ns\" (UID: \"6537b356-dc0d-424d-a085-e3661c2b26eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.074162 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6537b356-dc0d-424d-a085-e3661c2b26eb-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tr8ns\" (UID: \"6537b356-dc0d-424d-a085-e3661c2b26eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.074227 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6537b356-dc0d-424d-a085-e3661c2b26eb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tr8ns\" (UID: \"6537b356-dc0d-424d-a085-e3661c2b26eb\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.074276 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6537b356-dc0d-424d-a085-e3661c2b26eb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tr8ns\" (UID: \"6537b356-dc0d-424d-a085-e3661c2b26eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.074324 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6537b356-dc0d-424d-a085-e3661c2b26eb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tr8ns\" (UID: \"6537b356-dc0d-424d-a085-e3661c2b26eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.074392 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6537b356-dc0d-424d-a085-e3661c2b26eb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tr8ns\" (UID: \"6537b356-dc0d-424d-a085-e3661c2b26eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.074450 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6537b356-dc0d-424d-a085-e3661c2b26eb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tr8ns\" (UID: \"6537b356-dc0d-424d-a085-e3661c2b26eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.075308 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/6537b356-dc0d-424d-a085-e3661c2b26eb-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tr8ns\" (UID: \"6537b356-dc0d-424d-a085-e3661c2b26eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.081664 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6537b356-dc0d-424d-a085-e3661c2b26eb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tr8ns\" (UID: \"6537b356-dc0d-424d-a085-e3661c2b26eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.092674 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6537b356-dc0d-424d-a085-e3661c2b26eb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tr8ns\" (UID: \"6537b356-dc0d-424d-a085-e3661c2b26eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.109519 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=77.109495745 podStartE2EDuration="1m17.109495745s" podCreationTimestamp="2025-09-29 10:44:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:45:59.109097865 +0000 UTC m=+99.898239532" watchObservedRunningTime="2025-09-29 10:45:59.109495745 +0000 UTC m=+99.898637422" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.159167 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7kp7q" podStartSLOduration=79.159141617 podStartE2EDuration="1m19.159141617s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:45:59.142740724 +0000 UTC m=+99.931882391" watchObservedRunningTime="2025-09-29 10:45:59.159141617 +0000 UTC m=+99.948283274" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.174015 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xv5q7" podStartSLOduration=79.173987489 podStartE2EDuration="1m19.173987489s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:45:59.173075686 +0000 UTC m=+99.962217373" watchObservedRunningTime="2025-09-29 10:45:59.173987489 +0000 UTC m=+99.963129156" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.187049 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mp5pm" podStartSLOduration=78.187033854 podStartE2EDuration="1m18.187033854s" podCreationTimestamp="2025-09-29 10:44:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:45:59.185869954 +0000 UTC m=+99.975011621" watchObservedRunningTime="2025-09-29 10:45:59.187033854 +0000 UTC m=+99.976175521" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.212865 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=73.212845666 podStartE2EDuration="1m13.212845666s" podCreationTimestamp="2025-09-29 10:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:45:59.212346253 +0000 UTC m=+100.001487930" watchObservedRunningTime="2025-09-29 10:45:59.212845666 +0000 UTC m=+100.001987333" Sep 29 10:45:59 
crc kubenswrapper[4752]: I0929 10:45:59.237144 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=25.237125257 podStartE2EDuration="25.237125257s" podCreationTimestamp="2025-09-29 10:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:45:59.223750304 +0000 UTC m=+100.012891971" watchObservedRunningTime="2025-09-29 10:45:59.237125257 +0000 UTC m=+100.026266924" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.249568 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=51.249546346 podStartE2EDuration="51.249546346s" podCreationTimestamp="2025-09-29 10:45:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:45:59.249410112 +0000 UTC m=+100.038551779" watchObservedRunningTime="2025-09-29 10:45:59.249546346 +0000 UTC m=+100.038688013" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.265947 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns" Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.601367 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns" event={"ID":"6537b356-dc0d-424d-a085-e3661c2b26eb","Type":"ContainerStarted","Data":"fa882c46126affd03a3cdc3cded1cd7da9aefd72da5ad0f7f67420e9882d3ace"} Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.601441 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns" event={"ID":"6537b356-dc0d-424d-a085-e3661c2b26eb","Type":"ContainerStarted","Data":"a654337a5a7f9ac62462691651def2530adb312c1175baca68d2802671aa2ec8"} Sep 29 10:45:59 crc kubenswrapper[4752]: I0929 10:45:59.682729 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs\") pod \"network-metrics-daemon-sq7f4\" (UID: \"0a33b92e-d79c-4162-8500-df7a89df8df3\") " pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:45:59 crc kubenswrapper[4752]: E0929 10:45:59.683439 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 10:45:59 crc kubenswrapper[4752]: E0929 10:45:59.683678 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs podName:0a33b92e-d79c-4162-8500-df7a89df8df3 nodeName:}" failed. No retries permitted until 2025-09-29 10:47:03.683639886 +0000 UTC m=+164.472781703 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs") pod "network-metrics-daemon-sq7f4" (UID: "0a33b92e-d79c-4162-8500-df7a89df8df3") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 10:46:01 crc kubenswrapper[4752]: I0929 10:46:01.029956 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:01 crc kubenswrapper[4752]: E0929 10:46:01.030424 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:46:01 crc kubenswrapper[4752]: I0929 10:46:01.030029 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:01 crc kubenswrapper[4752]: E0929 10:46:01.030519 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:46:01 crc kubenswrapper[4752]: I0929 10:46:01.030029 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:46:01 crc kubenswrapper[4752]: I0929 10:46:01.030100 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:01 crc kubenswrapper[4752]: E0929 10:46:01.030591 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:46:01 crc kubenswrapper[4752]: E0929 10:46:01.030905 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:46:03 crc kubenswrapper[4752]: I0929 10:46:03.030925 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:03 crc kubenswrapper[4752]: I0929 10:46:03.030982 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:03 crc kubenswrapper[4752]: I0929 10:46:03.030957 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:46:03 crc kubenswrapper[4752]: I0929 10:46:03.030924 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:03 crc kubenswrapper[4752]: E0929 10:46:03.031061 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:46:03 crc kubenswrapper[4752]: E0929 10:46:03.031125 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:46:03 crc kubenswrapper[4752]: E0929 10:46:03.031583 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:46:03 crc kubenswrapper[4752]: I0929 10:46:03.031761 4752 scope.go:117] "RemoveContainer" containerID="f5083afbe3807e485df0ceb9323e330b0f37722f050f83895507559c9f655a21" Sep 29 10:46:03 crc kubenswrapper[4752]: E0929 10:46:03.031888 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:46:03 crc kubenswrapper[4752]: E0929 10:46:03.031983 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" Sep 29 10:46:05 crc kubenswrapper[4752]: I0929 10:46:05.030540 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:05 crc kubenswrapper[4752]: I0929 10:46:05.030695 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:05 crc kubenswrapper[4752]: E0929 10:46:05.030900 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:46:05 crc kubenswrapper[4752]: I0929 10:46:05.031005 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:05 crc kubenswrapper[4752]: I0929 10:46:05.031024 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:46:05 crc kubenswrapper[4752]: E0929 10:46:05.031062 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:46:05 crc kubenswrapper[4752]: E0929 10:46:05.031183 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:46:05 crc kubenswrapper[4752]: E0929 10:46:05.031302 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:46:07 crc kubenswrapper[4752]: I0929 10:46:07.030884 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:46:07 crc kubenswrapper[4752]: E0929 10:46:07.031073 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:46:07 crc kubenswrapper[4752]: I0929 10:46:07.030899 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:07 crc kubenswrapper[4752]: E0929 10:46:07.031157 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:46:07 crc kubenswrapper[4752]: I0929 10:46:07.030899 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:07 crc kubenswrapper[4752]: I0929 10:46:07.030922 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:07 crc kubenswrapper[4752]: E0929 10:46:07.031219 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:46:07 crc kubenswrapper[4752]: E0929 10:46:07.031336 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:46:09 crc kubenswrapper[4752]: I0929 10:46:09.029953 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:09 crc kubenswrapper[4752]: E0929 10:46:09.030048 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:46:09 crc kubenswrapper[4752]: I0929 10:46:09.030079 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:09 crc kubenswrapper[4752]: I0929 10:46:09.030117 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:09 crc kubenswrapper[4752]: E0929 10:46:09.030154 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:46:09 crc kubenswrapper[4752]: I0929 10:46:09.030325 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:46:09 crc kubenswrapper[4752]: E0929 10:46:09.030331 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:46:09 crc kubenswrapper[4752]: E0929 10:46:09.030388 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:46:11 crc kubenswrapper[4752]: I0929 10:46:11.029987 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:11 crc kubenswrapper[4752]: I0929 10:46:11.030044 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:46:11 crc kubenswrapper[4752]: I0929 10:46:11.030014 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:11 crc kubenswrapper[4752]: E0929 10:46:11.030111 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:46:11 crc kubenswrapper[4752]: I0929 10:46:11.029996 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:11 crc kubenswrapper[4752]: E0929 10:46:11.030187 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:46:11 crc kubenswrapper[4752]: E0929 10:46:11.030239 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:46:11 crc kubenswrapper[4752]: E0929 10:46:11.030288 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:46:13 crc kubenswrapper[4752]: I0929 10:46:13.030112 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:13 crc kubenswrapper[4752]: I0929 10:46:13.030129 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:46:13 crc kubenswrapper[4752]: I0929 10:46:13.030243 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:13 crc kubenswrapper[4752]: I0929 10:46:13.030137 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:13 crc kubenswrapper[4752]: E0929 10:46:13.030307 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:46:13 crc kubenswrapper[4752]: E0929 10:46:13.030461 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:46:13 crc kubenswrapper[4752]: E0929 10:46:13.030491 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:46:13 crc kubenswrapper[4752]: E0929 10:46:13.030567 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:46:15 crc kubenswrapper[4752]: I0929 10:46:15.030840 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:46:15 crc kubenswrapper[4752]: I0929 10:46:15.030879 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:15 crc kubenswrapper[4752]: I0929 10:46:15.030934 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:15 crc kubenswrapper[4752]: I0929 10:46:15.031002 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:15 crc kubenswrapper[4752]: E0929 10:46:15.031701 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:46:15 crc kubenswrapper[4752]: I0929 10:46:15.032246 4752 scope.go:117] "RemoveContainer" containerID="f5083afbe3807e485df0ceb9323e330b0f37722f050f83895507559c9f655a21" Sep 29 10:46:15 crc kubenswrapper[4752]: E0929 10:46:15.032277 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:46:15 crc kubenswrapper[4752]: E0929 10:46:15.032481 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:46:15 crc kubenswrapper[4752]: E0929 10:46:15.032532 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:46:15 crc kubenswrapper[4752]: E0929 10:46:15.032587 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-c2vrh_openshift-ovn-kubernetes(94028c24-ec10-4d5c-b32c-1700e677d539)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" Sep 29 10:46:15 crc kubenswrapper[4752]: I0929 10:46:15.654700 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xv5q7_52fc9378-c37b-424b-afde-7b191bab5fde/kube-multus/1.log" Sep 29 10:46:15 crc kubenswrapper[4752]: I0929 10:46:15.655779 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xv5q7_52fc9378-c37b-424b-afde-7b191bab5fde/kube-multus/0.log" Sep 29 10:46:15 crc kubenswrapper[4752]: I0929 
10:46:15.655899 4752 generic.go:334] "Generic (PLEG): container finished" podID="52fc9378-c37b-424b-afde-7b191bab5fde" containerID="d36b7c0411c2a5cbcb37f626fa70cfe6c7d3fc6280f6a9e882fa27766f6de761" exitCode=1 Sep 29 10:46:15 crc kubenswrapper[4752]: I0929 10:46:15.655962 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xv5q7" event={"ID":"52fc9378-c37b-424b-afde-7b191bab5fde","Type":"ContainerDied","Data":"d36b7c0411c2a5cbcb37f626fa70cfe6c7d3fc6280f6a9e882fa27766f6de761"} Sep 29 10:46:15 crc kubenswrapper[4752]: I0929 10:46:15.656064 4752 scope.go:117] "RemoveContainer" containerID="30ee75a35da106cc9424c7a3f97f28d0c711200667372c023612db4a9701c189" Sep 29 10:46:15 crc kubenswrapper[4752]: I0929 10:46:15.656676 4752 scope.go:117] "RemoveContainer" containerID="d36b7c0411c2a5cbcb37f626fa70cfe6c7d3fc6280f6a9e882fa27766f6de761" Sep 29 10:46:15 crc kubenswrapper[4752]: E0929 10:46:15.657035 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-xv5q7_openshift-multus(52fc9378-c37b-424b-afde-7b191bab5fde)\"" pod="openshift-multus/multus-xv5q7" podUID="52fc9378-c37b-424b-afde-7b191bab5fde" Sep 29 10:46:15 crc kubenswrapper[4752]: I0929 10:46:15.685687 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tr8ns" podStartSLOduration=95.685669218 podStartE2EDuration="1m35.685669218s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:45:59.620697283 +0000 UTC m=+100.409838950" watchObservedRunningTime="2025-09-29 10:46:15.685669218 +0000 UTC m=+116.474810885" Sep 29 10:46:16 crc kubenswrapper[4752]: I0929 10:46:16.663670 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-xv5q7_52fc9378-c37b-424b-afde-7b191bab5fde/kube-multus/1.log" Sep 29 10:46:17 crc kubenswrapper[4752]: I0929 10:46:17.030765 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:46:17 crc kubenswrapper[4752]: I0929 10:46:17.030765 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:17 crc kubenswrapper[4752]: I0929 10:46:17.030765 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:17 crc kubenswrapper[4752]: I0929 10:46:17.030976 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:17 crc kubenswrapper[4752]: E0929 10:46:17.031131 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:46:17 crc kubenswrapper[4752]: E0929 10:46:17.031469 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:46:17 crc kubenswrapper[4752]: E0929 10:46:17.031610 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:46:17 crc kubenswrapper[4752]: E0929 10:46:17.031666 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:46:19 crc kubenswrapper[4752]: I0929 10:46:19.030426 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:19 crc kubenswrapper[4752]: I0929 10:46:19.030454 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:19 crc kubenswrapper[4752]: I0929 10:46:19.030454 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:46:19 crc kubenswrapper[4752]: E0929 10:46:19.030561 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:46:19 crc kubenswrapper[4752]: I0929 10:46:19.030630 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:19 crc kubenswrapper[4752]: E0929 10:46:19.030649 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:46:19 crc kubenswrapper[4752]: E0929 10:46:19.030826 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:46:19 crc kubenswrapper[4752]: E0929 10:46:19.030886 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:46:20 crc kubenswrapper[4752]: E0929 10:46:20.039578 4752 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Sep 29 10:46:20 crc kubenswrapper[4752]: E0929 10:46:20.150840 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 10:46:21 crc kubenswrapper[4752]: I0929 10:46:21.030907 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:21 crc kubenswrapper[4752]: I0929 10:46:21.030993 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:46:21 crc kubenswrapper[4752]: I0929 10:46:21.031066 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:21 crc kubenswrapper[4752]: I0929 10:46:21.030941 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:21 crc kubenswrapper[4752]: E0929 10:46:21.031186 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:46:21 crc kubenswrapper[4752]: E0929 10:46:21.031338 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:46:21 crc kubenswrapper[4752]: E0929 10:46:21.031862 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:46:21 crc kubenswrapper[4752]: E0929 10:46:21.032001 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:46:23 crc kubenswrapper[4752]: I0929 10:46:23.030205 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:23 crc kubenswrapper[4752]: I0929 10:46:23.030264 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:23 crc kubenswrapper[4752]: E0929 10:46:23.030371 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:46:23 crc kubenswrapper[4752]: I0929 10:46:23.030398 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:46:23 crc kubenswrapper[4752]: I0929 10:46:23.030386 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:23 crc kubenswrapper[4752]: E0929 10:46:23.030454 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:46:23 crc kubenswrapper[4752]: E0929 10:46:23.030714 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:46:23 crc kubenswrapper[4752]: E0929 10:46:23.030879 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:46:25 crc kubenswrapper[4752]: I0929 10:46:25.030648 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:25 crc kubenswrapper[4752]: I0929 10:46:25.030648 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:25 crc kubenswrapper[4752]: E0929 10:46:25.030910 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:46:25 crc kubenswrapper[4752]: I0929 10:46:25.030794 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:25 crc kubenswrapper[4752]: E0929 10:46:25.031003 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:46:25 crc kubenswrapper[4752]: E0929 10:46:25.031223 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:46:25 crc kubenswrapper[4752]: I0929 10:46:25.031290 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:46:25 crc kubenswrapper[4752]: E0929 10:46:25.031987 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:46:25 crc kubenswrapper[4752]: E0929 10:46:25.152005 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 10:46:27 crc kubenswrapper[4752]: I0929 10:46:27.030528 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:46:27 crc kubenswrapper[4752]: I0929 10:46:27.030558 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:27 crc kubenswrapper[4752]: I0929 10:46:27.030594 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:27 crc kubenswrapper[4752]: E0929 10:46:27.030689 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:46:27 crc kubenswrapper[4752]: E0929 10:46:27.030907 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:46:27 crc kubenswrapper[4752]: E0929 10:46:27.030989 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:46:27 crc kubenswrapper[4752]: I0929 10:46:27.031145 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:27 crc kubenswrapper[4752]: E0929 10:46:27.031204 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:46:28 crc kubenswrapper[4752]: I0929 10:46:28.031738 4752 scope.go:117] "RemoveContainer" containerID="f5083afbe3807e485df0ceb9323e330b0f37722f050f83895507559c9f655a21" Sep 29 10:46:28 crc kubenswrapper[4752]: I0929 10:46:28.708349 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovnkube-controller/3.log" Sep 29 10:46:28 crc kubenswrapper[4752]: I0929 10:46:28.712030 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerStarted","Data":"3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7"} Sep 29 10:46:28 crc kubenswrapper[4752]: I0929 10:46:28.712606 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:46:28 crc kubenswrapper[4752]: I0929 10:46:28.738535 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" podStartSLOduration=108.738508174 podStartE2EDuration="1m48.738508174s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:28.73837224 +0000 UTC m=+129.527513937" watchObservedRunningTime="2025-09-29 10:46:28.738508174 +0000 UTC m=+129.527649841" Sep 29 10:46:28 crc kubenswrapper[4752]: I0929 10:46:28.892233 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sq7f4"] Sep 29 10:46:28 crc kubenswrapper[4752]: I0929 10:46:28.892352 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:46:28 crc kubenswrapper[4752]: E0929 10:46:28.892471 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:46:29 crc kubenswrapper[4752]: I0929 10:46:29.030206 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:29 crc kubenswrapper[4752]: I0929 10:46:29.030232 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:29 crc kubenswrapper[4752]: I0929 10:46:29.030206 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:29 crc kubenswrapper[4752]: E0929 10:46:29.030363 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:46:29 crc kubenswrapper[4752]: E0929 10:46:29.030464 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:46:29 crc kubenswrapper[4752]: E0929 10:46:29.030649 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:46:30 crc kubenswrapper[4752]: I0929 10:46:30.030139 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:46:30 crc kubenswrapper[4752]: E0929 10:46:30.031367 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:46:30 crc kubenswrapper[4752]: I0929 10:46:30.031710 4752 scope.go:117] "RemoveContainer" containerID="d36b7c0411c2a5cbcb37f626fa70cfe6c7d3fc6280f6a9e882fa27766f6de761" Sep 29 10:46:30 crc kubenswrapper[4752]: E0929 10:46:30.152563 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Sep 29 10:46:30 crc kubenswrapper[4752]: I0929 10:46:30.720520 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xv5q7_52fc9378-c37b-424b-afde-7b191bab5fde/kube-multus/1.log" Sep 29 10:46:30 crc kubenswrapper[4752]: I0929 10:46:30.720574 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xv5q7" event={"ID":"52fc9378-c37b-424b-afde-7b191bab5fde","Type":"ContainerStarted","Data":"eff8591a1e7e061df63a2f3b4b4af9f4dd03197426fd89027902ac085abf289f"} Sep 29 10:46:31 crc kubenswrapper[4752]: I0929 10:46:31.030016 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:31 crc kubenswrapper[4752]: I0929 10:46:31.030027 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:31 crc kubenswrapper[4752]: E0929 10:46:31.030314 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:46:31 crc kubenswrapper[4752]: E0929 10:46:31.030194 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:46:31 crc kubenswrapper[4752]: I0929 10:46:31.030027 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:31 crc kubenswrapper[4752]: E0929 10:46:31.030404 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:46:32 crc kubenswrapper[4752]: I0929 10:46:32.030516 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:46:32 crc kubenswrapper[4752]: E0929 10:46:32.030725 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:46:33 crc kubenswrapper[4752]: I0929 10:46:33.030820 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:33 crc kubenswrapper[4752]: I0929 10:46:33.030820 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:33 crc kubenswrapper[4752]: E0929 10:46:33.030968 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:46:33 crc kubenswrapper[4752]: E0929 10:46:33.031032 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:46:33 crc kubenswrapper[4752]: I0929 10:46:33.030855 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:33 crc kubenswrapper[4752]: E0929 10:46:33.031105 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:46:34 crc kubenswrapper[4752]: I0929 10:46:34.031190 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:46:34 crc kubenswrapper[4752]: E0929 10:46:34.031535 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq7f4" podUID="0a33b92e-d79c-4162-8500-df7a89df8df3" Sep 29 10:46:35 crc kubenswrapper[4752]: I0929 10:46:35.029985 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:35 crc kubenswrapper[4752]: I0929 10:46:35.030068 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:35 crc kubenswrapper[4752]: I0929 10:46:35.030068 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:35 crc kubenswrapper[4752]: E0929 10:46:35.030205 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 10:46:35 crc kubenswrapper[4752]: E0929 10:46:35.030287 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 10:46:35 crc kubenswrapper[4752]: E0929 10:46:35.030386 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 10:46:36 crc kubenswrapper[4752]: I0929 10:46:36.030303 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:46:36 crc kubenswrapper[4752]: I0929 10:46:36.033309 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Sep 29 10:46:36 crc kubenswrapper[4752]: I0929 10:46:36.033656 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Sep 29 10:46:37 crc kubenswrapper[4752]: I0929 10:46:37.030558 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:37 crc kubenswrapper[4752]: I0929 10:46:37.030575 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:37 crc kubenswrapper[4752]: I0929 10:46:37.031248 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:37 crc kubenswrapper[4752]: I0929 10:46:37.033346 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Sep 29 10:46:37 crc kubenswrapper[4752]: I0929 10:46:37.033365 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Sep 29 10:46:37 crc kubenswrapper[4752]: I0929 10:46:37.034324 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Sep 29 10:46:37 crc kubenswrapper[4752]: I0929 10:46:37.037641 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Sep 29 10:46:37 crc kubenswrapper[4752]: I0929 10:46:37.976936 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.847724 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.885613 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ntjk6"] Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.886427 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ntjk6" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.886924 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rvflz"] Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.887412 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.887887 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-48jd6"] Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.888219 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-48jd6" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.889884 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-6ng8r"] Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.890335 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6ng8r" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.892651 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4t7tw"] Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.893331 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4t7tw" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.896150 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.896196 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.896510 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.896691 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.897198 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.897361 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.898490 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.900144 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.900732 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.900956 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: 
I0929 10:46:39.901086 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7kk"] Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.901935 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7sz2n"] Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.902693 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.903108 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7kk" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.915375 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.915548 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.915623 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.916574 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.916634 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.916878 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.917002 4752 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"openshift-service-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.917051 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.917185 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.917292 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.917324 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.917401 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.917480 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.917518 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.917571 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.917675 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.917753 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.918039 4752 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.918228 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5"] Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.918476 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.918657 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.918765 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.918885 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.918976 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.919071 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.919123 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.919260 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.920070 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.920248 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.920351 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.920434 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.921430 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xs56\" (UniqueName: \"kubernetes.io/projected/e8089407-89fb-42c1-8947-58fa83f8ef4c-kube-api-access-6xs56\") pod \"machine-approver-56656f9798-6ng8r\" (UID: \"e8089407-89fb-42c1-8947-58fa83f8ef4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6ng8r" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.921426 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-bhw29"] Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.921994 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-bhw29" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.921488 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-serving-cert\") pod \"controller-manager-879f6c89f-rvflz\" (UID: \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922296 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-etcd-client\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922334 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-client-ca\") pod \"controller-manager-879f6c89f-rvflz\" (UID: \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922352 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tktcs\" (UniqueName: \"kubernetes.io/projected/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-kube-api-access-tktcs\") pod \"controller-manager-879f6c89f-rvflz\" (UID: \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922373 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6b7ad647-b9dc-4694-beab-5908b529d9cf-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-48jd6\" (UID: \"6b7ad647-b9dc-4694-beab-5908b529d9cf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-48jd6" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922399 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2477e356-dc04-44a6-bec0-e7304134493f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ntjk6\" (UID: \"2477e356-dc04-44a6-bec0-e7304134493f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ntjk6" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922415 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwft9\" (UniqueName: \"kubernetes.io/projected/fd3a77bc-f325-4019-ad7b-e03b97e0471a-kube-api-access-xwft9\") pod \"cluster-samples-operator-665b6dd947-fw7kk\" (UID: \"fd3a77bc-f325-4019-ad7b-e03b97e0471a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7kk" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922434 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e8089407-89fb-42c1-8947-58fa83f8ef4c-machine-approver-tls\") pod \"machine-approver-56656f9798-6ng8r\" (UID: \"e8089407-89fb-42c1-8947-58fa83f8ef4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6ng8r" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922449 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9b3ddc5-03c1-4d65-b890-660e9e8cc6c0-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4t7tw\" (UID: 
\"e9b3ddc5-03c1-4d65-b890-660e9e8cc6c0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4t7tw" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922469 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5jm8\" (UniqueName: \"kubernetes.io/projected/6b7ad647-b9dc-4694-beab-5908b529d9cf-kube-api-access-c5jm8\") pod \"cluster-image-registry-operator-dc59b4c8b-48jd6\" (UID: \"6b7ad647-b9dc-4694-beab-5908b529d9cf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-48jd6" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922487 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-etcd-serving-ca\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922520 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-config\") pod \"controller-manager-879f6c89f-rvflz\" (UID: \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922534 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9b3ddc5-03c1-4d65-b890-660e9e8cc6c0-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4t7tw\" (UID: \"e9b3ddc5-03c1-4d65-b890-660e9e8cc6c0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4t7tw" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922555 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-audit-dir\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922573 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b7ad647-b9dc-4694-beab-5908b529d9cf-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-48jd6\" (UID: \"6b7ad647-b9dc-4694-beab-5908b529d9cf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-48jd6" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922594 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-node-pullsecrets\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922613 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922633 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2477e356-dc04-44a6-bec0-e7304134493f-config\") pod \"machine-api-operator-5694c8668f-ntjk6\" (UID: \"2477e356-dc04-44a6-bec0-e7304134493f\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-ntjk6" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922656 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw72r\" (UniqueName: \"kubernetes.io/projected/2477e356-dc04-44a6-bec0-e7304134493f-kube-api-access-hw72r\") pod \"machine-api-operator-5694c8668f-ntjk6\" (UID: \"2477e356-dc04-44a6-bec0-e7304134493f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ntjk6" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922674 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-audit\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922692 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-serving-cert\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922712 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b7ad647-b9dc-4694-beab-5908b529d9cf-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-48jd6\" (UID: \"6b7ad647-b9dc-4694-beab-5908b529d9cf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-48jd6" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922732 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e8089407-89fb-42c1-8947-58fa83f8ef4c-config\") pod \"machine-approver-56656f9798-6ng8r\" (UID: \"e8089407-89fb-42c1-8947-58fa83f8ef4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6ng8r" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922759 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr"] Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922826 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dcnj\" (UniqueName: \"kubernetes.io/projected/e9b3ddc5-03c1-4d65-b890-660e9e8cc6c0-kube-api-access-9dcnj\") pod \"openshift-apiserver-operator-796bbdcf4f-4t7tw\" (UID: \"e9b3ddc5-03c1-4d65-b890-660e9e8cc6c0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4t7tw" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922853 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd3a77bc-f325-4019-ad7b-e03b97e0471a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fw7kk\" (UID: \"fd3a77bc-f325-4019-ad7b-e03b97e0471a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7kk" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922922 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lrwc\" (UniqueName: \"kubernetes.io/projected/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-kube-api-access-7lrwc\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.922997 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/2477e356-dc04-44a6-bec0-e7304134493f-images\") pod \"machine-api-operator-5694c8668f-ntjk6\" (UID: \"2477e356-dc04-44a6-bec0-e7304134493f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ntjk6" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.923021 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-image-import-ca\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.923057 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8089407-89fb-42c1-8947-58fa83f8ef4c-auth-proxy-config\") pod \"machine-approver-56656f9798-6ng8r\" (UID: \"e8089407-89fb-42c1-8947-58fa83f8ef4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6ng8r" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.923077 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-config\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.923092 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-encryption-config\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.923123 4752 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-rvflz\" (UID: \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.923638 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.931309 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr"] Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.931785 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dbt29"] Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.932086 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.932123 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.932131 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.932222 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.932485 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.932591 4752 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.932781 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.933867 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.934026 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.934137 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.934322 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.934453 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.935091 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.935583 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.935692 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.935791 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.936822 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.937485 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.939879 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.941461 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-prpgr"] Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.941820 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-bp6hz"] Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.941831 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.941942 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.942056 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s6ksk"] Sep 29 10:46:39 crc 
kubenswrapper[4752]: I0929 10:46:39.942141 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.942167 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.942141 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dbt29" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.942434 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.953571 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.954109 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.954373 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.954516 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.957048 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.957270 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.957421 4752 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"openshift-service-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.957515 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.957555 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.957631 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.957661 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.958667 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bq279"] Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.960328 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s6ksk" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.961917 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.962727 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj"] Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.963087 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bq279" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.966052 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.966574 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.966755 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.967208 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.967603 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.968853 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj" Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.973791 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-57fqh"] Sep 29 10:46:39 crc kubenswrapper[4752]: I0929 10:46:39.976980 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hbftw"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.002827 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-r8k8r"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.003135 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-whxr2"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.003459 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-krscp"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.003895 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mfldp"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.004201 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s8pf8"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.004521 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s8pf8" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.004717 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.004859 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hbftw" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.005026 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-r8k8r" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.005214 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-whxr2" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.005391 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-krscp" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.005585 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mfldp" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.005737 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.007147 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.007245 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.008694 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.009176 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.009324 4752 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.011376 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-79rht"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.011829 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-974fr"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.012578 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-974fr" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.012777 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.012875 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.012927 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.012950 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.012965 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.012881 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.013082 4752 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.013080 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.013249 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.013563 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.013693 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.013770 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.013886 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.014034 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.014472 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.014580 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.014768 4752 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.016335 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-cxqx7"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.017401 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-cxqx7" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.019569 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.019858 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.022706 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.026362 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2477e356-dc04-44a6-bec0-e7304134493f-config\") pod \"machine-api-operator-5694c8668f-ntjk6\" (UID: \"2477e356-dc04-44a6-bec0-e7304134493f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ntjk6" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.026424 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-audit\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.026451 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-serving-cert\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.026494 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b7ad647-b9dc-4694-beab-5908b529d9cf-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-48jd6\" (UID: \"6b7ad647-b9dc-4694-beab-5908b529d9cf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-48jd6" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.026518 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw72r\" (UniqueName: \"kubernetes.io/projected/2477e356-dc04-44a6-bec0-e7304134493f-kube-api-access-hw72r\") pod \"machine-api-operator-5694c8668f-ntjk6\" (UID: \"2477e356-dc04-44a6-bec0-e7304134493f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ntjk6" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.026539 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8089407-89fb-42c1-8947-58fa83f8ef4c-config\") pod \"machine-approver-56656f9798-6ng8r\" (UID: \"e8089407-89fb-42c1-8947-58fa83f8ef4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6ng8r" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.026567 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dcnj\" (UniqueName: \"kubernetes.io/projected/e9b3ddc5-03c1-4d65-b890-660e9e8cc6c0-kube-api-access-9dcnj\") pod \"openshift-apiserver-operator-796bbdcf4f-4t7tw\" (UID: \"e9b3ddc5-03c1-4d65-b890-660e9e8cc6c0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4t7tw" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 
10:46:40.026616 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd3a77bc-f325-4019-ad7b-e03b97e0471a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fw7kk\" (UID: \"fd3a77bc-f325-4019-ad7b-e03b97e0471a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7kk" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.026643 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lrwc\" (UniqueName: \"kubernetes.io/projected/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-kube-api-access-7lrwc\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.026679 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2477e356-dc04-44a6-bec0-e7304134493f-images\") pod \"machine-api-operator-5694c8668f-ntjk6\" (UID: \"2477e356-dc04-44a6-bec0-e7304134493f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ntjk6" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.026694 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-image-import-ca\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.026717 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8089407-89fb-42c1-8947-58fa83f8ef4c-auth-proxy-config\") pod \"machine-approver-56656f9798-6ng8r\" (UID: \"e8089407-89fb-42c1-8947-58fa83f8ef4c\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6ng8r" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.026742 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-config\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.026762 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-encryption-config\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.027608 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-audit\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.027983 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2477e356-dc04-44a6-bec0-e7304134493f-config\") pod \"machine-api-operator-5694c8668f-ntjk6\" (UID: \"2477e356-dc04-44a6-bec0-e7304134493f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ntjk6" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.028667 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8089407-89fb-42c1-8947-58fa83f8ef4c-auth-proxy-config\") pod \"machine-approver-56656f9798-6ng8r\" (UID: \"e8089407-89fb-42c1-8947-58fa83f8ef4c\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6ng8r" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.028845 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-image-import-ca\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029328 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8089407-89fb-42c1-8947-58fa83f8ef4c-config\") pod \"machine-approver-56656f9798-6ng8r\" (UID: \"e8089407-89fb-42c1-8947-58fa83f8ef4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6ng8r" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029337 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-config\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029338 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-rvflz\" (UID: \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029433 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xs56\" (UniqueName: \"kubernetes.io/projected/e8089407-89fb-42c1-8947-58fa83f8ef4c-kube-api-access-6xs56\") pod \"machine-approver-56656f9798-6ng8r\" (UID: 
\"e8089407-89fb-42c1-8947-58fa83f8ef4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6ng8r" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029458 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-serving-cert\") pod \"controller-manager-879f6c89f-rvflz\" (UID: \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029480 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-etcd-client\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029517 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-client-ca\") pod \"controller-manager-879f6c89f-rvflz\" (UID: \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029538 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tktcs\" (UniqueName: \"kubernetes.io/projected/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-kube-api-access-tktcs\") pod \"controller-manager-879f6c89f-rvflz\" (UID: \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029562 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6b7ad647-b9dc-4694-beab-5908b529d9cf-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-48jd6\" (UID: \"6b7ad647-b9dc-4694-beab-5908b529d9cf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-48jd6" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029593 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2477e356-dc04-44a6-bec0-e7304134493f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ntjk6\" (UID: \"2477e356-dc04-44a6-bec0-e7304134493f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ntjk6" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029612 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwft9\" (UniqueName: \"kubernetes.io/projected/fd3a77bc-f325-4019-ad7b-e03b97e0471a-kube-api-access-xwft9\") pod \"cluster-samples-operator-665b6dd947-fw7kk\" (UID: \"fd3a77bc-f325-4019-ad7b-e03b97e0471a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7kk" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029641 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e8089407-89fb-42c1-8947-58fa83f8ef4c-machine-approver-tls\") pod \"machine-approver-56656f9798-6ng8r\" (UID: \"e8089407-89fb-42c1-8947-58fa83f8ef4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6ng8r" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029664 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9b3ddc5-03c1-4d65-b890-660e9e8cc6c0-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4t7tw\" (UID: \"e9b3ddc5-03c1-4d65-b890-660e9e8cc6c0\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4t7tw" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029699 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5jm8\" (UniqueName: \"kubernetes.io/projected/6b7ad647-b9dc-4694-beab-5908b529d9cf-kube-api-access-c5jm8\") pod \"cluster-image-registry-operator-dc59b4c8b-48jd6\" (UID: \"6b7ad647-b9dc-4694-beab-5908b529d9cf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-48jd6" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029723 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-etcd-serving-ca\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029762 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-config\") pod \"controller-manager-879f6c89f-rvflz\" (UID: \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029789 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9b3ddc5-03c1-4d65-b890-660e9e8cc6c0-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4t7tw\" (UID: \"e9b3ddc5-03c1-4d65-b890-660e9e8cc6c0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4t7tw" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029834 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-audit-dir\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029852 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b7ad647-b9dc-4694-beab-5908b529d9cf-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-48jd6\" (UID: \"6b7ad647-b9dc-4694-beab-5908b529d9cf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-48jd6" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029873 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-node-pullsecrets\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029891 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029952 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.029997 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2477e356-dc04-44a6-bec0-e7304134493f-images\") pod \"machine-api-operator-5694c8668f-ntjk6\" (UID: 
\"2477e356-dc04-44a6-bec0-e7304134493f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ntjk6" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.033693 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.034091 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.036011 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-rvflz\" (UID: \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.036212 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-config\") pod \"controller-manager-879f6c89f-rvflz\" (UID: \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.036240 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.036441 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-audit-dir\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") 
" pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.045971 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-etcd-serving-ca\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.046408 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-node-pullsecrets\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.047500 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-client-ca\") pod \"controller-manager-879f6c89f-rvflz\" (UID: \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.049169 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b7ad647-b9dc-4694-beab-5908b529d9cf-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-48jd6\" (UID: \"6b7ad647-b9dc-4694-beab-5908b529d9cf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-48jd6" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.050237 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd3a77bc-f325-4019-ad7b-e03b97e0471a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fw7kk\" (UID: 
\"fd3a77bc-f325-4019-ad7b-e03b97e0471a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7kk" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.051203 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e8089407-89fb-42c1-8947-58fa83f8ef4c-machine-approver-tls\") pod \"machine-approver-56656f9798-6ng8r\" (UID: \"e8089407-89fb-42c1-8947-58fa83f8ef4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6ng8r" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.053605 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6946k"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.053750 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2477e356-dc04-44a6-bec0-e7304134493f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ntjk6\" (UID: \"2477e356-dc04-44a6-bec0-e7304134493f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ntjk6" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.057396 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.058835 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lwcrf"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.060126 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-serving-cert\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 
10:46:40.060231 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6946k" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.060395 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-encryption-config\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.061195 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.062210 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.064851 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.065052 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lwcrf" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.067104 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8vwzv"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.068484 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8vwzv" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.068849 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.069702 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9b3ddc5-03c1-4d65-b890-660e9e8cc6c0-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4t7tw\" (UID: \"e9b3ddc5-03c1-4d65-b890-660e9e8cc6c0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4t7tw" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.070932 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.072048 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.074371 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9b3ddc5-03c1-4d65-b890-660e9e8cc6c0-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4t7tw\" (UID: \"e9b3ddc5-03c1-4d65-b890-660e9e8cc6c0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4t7tw" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.074701 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b7ad647-b9dc-4694-beab-5908b529d9cf-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-48jd6\" (UID: \"6b7ad647-b9dc-4694-beab-5908b529d9cf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-48jd6" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.078561 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-etcd-client\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.080025 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gvhpf"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.081348 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gvhpf" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.081592 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-serving-cert\") pod \"controller-manager-879f6c89f-rvflz\" (UID: \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.089262 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.089486 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.089363 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pbmfv"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.090538 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.091492 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-npmvp"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.093170 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mw85q"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.093767 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw85q" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.094056 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-npmvp" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.095835 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.096565 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.097559 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-48jd6"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.109014 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ntjk6"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.109854 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.110007 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-bhv9s"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.110778 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-bhv9s" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.111293 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fc286"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.111822 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fc286" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.112471 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7kk"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.113605 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4t7tw"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.114642 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s6ksk"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.115696 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bq279"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.116746 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.117987 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.119029 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rvflz"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.119983 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.121042 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7sz2n"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.122118 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mfldp"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.123158 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.124232 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.125279 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-79rht"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.127922 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pbmfv"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.129383 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-57fqh"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.130863 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-bhw29"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.132288 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6946k"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.133658 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hbftw"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.133663 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.135155 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dbt29"] Sep 29 10:46:40 crc 
kubenswrapper[4752]: I0929 10:46:40.137742 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-cxqx7"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.138049 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s8pf8"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.139484 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lwcrf"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.140609 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.141751 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-krscp"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.143133 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.144560 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fc286"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.146080 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jvk96"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.147102 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jvk96" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.147287 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-zr2lt"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.148712 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.149387 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.149781 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-whxr2"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.154173 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-npmvp"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.161936 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-bp6hz"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.166611 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mw85q"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.168240 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-prpgr"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.168735 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.169002 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.170215 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-974fr"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.171659 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-zr2lt"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.172634 4752 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gvhpf"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.174022 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8vwzv"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.174977 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jvk96"] Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.188619 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.209316 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.228368 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.248638 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.269077 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.288487 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.309658 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.328654 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Sep 29 10:46:40 crc 
kubenswrapper[4752]: I0929 10:46:40.348929 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.368632 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.389872 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.410226 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.430147 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.448883 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.469968 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.490035 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.510354 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.530543 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.549607 4752 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.570167 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.590057 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.629665 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.649026 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.669513 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.689023 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.709538 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.749627 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.772129 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.789145 4752 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.809504 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.828705 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.849367 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.869485 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.889504 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.909985 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.929421 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.950074 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Sep 29 10:46:40 crc kubenswrapper[4752]: I0929 10:46:40.969197 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.014766 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dcnj\" (UniqueName: \"kubernetes.io/projected/e9b3ddc5-03c1-4d65-b890-660e9e8cc6c0-kube-api-access-9dcnj\") pod \"openshift-apiserver-operator-796bbdcf4f-4t7tw\" (UID: 
\"e9b3ddc5-03c1-4d65-b890-660e9e8cc6c0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4t7tw" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.035694 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lrwc\" (UniqueName: \"kubernetes.io/projected/2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68-kube-api-access-7lrwc\") pod \"apiserver-76f77b778f-7sz2n\" (UID: \"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68\") " pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.043929 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b7ad647-b9dc-4694-beab-5908b529d9cf-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-48jd6\" (UID: \"6b7ad647-b9dc-4694-beab-5908b529d9cf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-48jd6" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.047727 4752 request.go:700] Waited for 1.014230531s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/serviceaccounts/machine-api-operator/token Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.067493 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw72r\" (UniqueName: \"kubernetes.io/projected/2477e356-dc04-44a6-bec0-e7304134493f-kube-api-access-hw72r\") pod \"machine-api-operator-5694c8668f-ntjk6\" (UID: \"2477e356-dc04-44a6-bec0-e7304134493f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ntjk6" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.085994 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5jm8\" (UniqueName: \"kubernetes.io/projected/6b7ad647-b9dc-4694-beab-5908b529d9cf-kube-api-access-c5jm8\") pod \"cluster-image-registry-operator-dc59b4c8b-48jd6\" 
(UID: \"6b7ad647-b9dc-4694-beab-5908b529d9cf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-48jd6" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.106249 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xs56\" (UniqueName: \"kubernetes.io/projected/e8089407-89fb-42c1-8947-58fa83f8ef4c-kube-api-access-6xs56\") pod \"machine-approver-56656f9798-6ng8r\" (UID: \"e8089407-89fb-42c1-8947-58fa83f8ef4c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6ng8r" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.115434 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ntjk6" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.124437 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tktcs\" (UniqueName: \"kubernetes.io/projected/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-kube-api-access-tktcs\") pod \"controller-manager-879f6c89f-rvflz\" (UID: \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.131251 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.139919 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-48jd6" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.143539 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwft9\" (UniqueName: \"kubernetes.io/projected/fd3a77bc-f325-4019-ad7b-e03b97e0471a-kube-api-access-xwft9\") pod \"cluster-samples-operator-665b6dd947-fw7kk\" (UID: \"fd3a77bc-f325-4019-ad7b-e03b97e0471a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7kk" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.149661 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.169131 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.180055 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6ng8r" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.190358 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.208720 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.215095 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4t7tw" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.229370 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.230602 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.250252 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.268502 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.289113 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.296425 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7kk" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.311478 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ntjk6"] Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.312582 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.329667 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.350619 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.368994 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.389754 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.410227 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.429997 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.438306 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4t7tw"] Sep 29 10:46:41 crc kubenswrapper[4752]: W0929 10:46:41.446039 4752 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9b3ddc5_03c1_4d65_b890_660e9e8cc6c0.slice/crio-4805214707f7da4d59ba360b21bf5c445d91a41d96e43dad15ef0ab247b7795c WatchSource:0}: Error finding container 4805214707f7da4d59ba360b21bf5c445d91a41d96e43dad15ef0ab247b7795c: Status 404 returned error can't find the container with id 4805214707f7da4d59ba360b21bf5c445d91a41d96e43dad15ef0ab247b7795c Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.449113 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.460693 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7sz2n"] Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.469522 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Sep 29 10:46:41 crc kubenswrapper[4752]: W0929 10:46:41.474405 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f1a2a22_45fb_441a_a05d_fb6c6dbf9e68.slice/crio-e75290bf5912b9330d704a9ecc30324420ba000c384cfc00982f22f7acb188fa WatchSource:0}: Error finding container e75290bf5912b9330d704a9ecc30324420ba000c384cfc00982f22f7acb188fa: Status 404 returned error can't find the container with id e75290bf5912b9330d704a9ecc30324420ba000c384cfc00982f22f7acb188fa Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.489599 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.508696 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Sep 29 10:46:41 crc 
kubenswrapper[4752]: I0929 10:46:41.512401 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7kk"] Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.529149 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.557932 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-48jd6"] Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.562723 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.569056 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.570438 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rvflz"] Sep 29 10:46:41 crc kubenswrapper[4752]: W0929 10:46:41.581195 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3f0dcdd_283d_4ed5_889a_da260dcf13b0.slice/crio-41a6dd3179fbf9f69b95fcd8c10f11a3a08fec1e76ddb4a9ea951c5e06ba3c8d WatchSource:0}: Error finding container 41a6dd3179fbf9f69b95fcd8c10f11a3a08fec1e76ddb4a9ea951c5e06ba3c8d: Status 404 returned error can't find the container with id 41a6dd3179fbf9f69b95fcd8c10f11a3a08fec1e76ddb4a9ea951c5e06ba3c8d Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.589306 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.609649 4752 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-metrics" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.629311 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.649097 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.670159 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.689233 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.708661 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.729470 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.749880 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.759540 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4t7tw" event={"ID":"e9b3ddc5-03c1-4d65-b890-660e9e8cc6c0","Type":"ContainerStarted","Data":"4805214707f7da4d59ba360b21bf5c445d91a41d96e43dad15ef0ab247b7795c"} Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.762248 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6ng8r" 
event={"ID":"e8089407-89fb-42c1-8947-58fa83f8ef4c","Type":"ContainerStarted","Data":"9765b4cac490c4044a78248253a0fe07b37868939f84303524d5fa0d9eb19a90"} Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.762289 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6ng8r" event={"ID":"e8089407-89fb-42c1-8947-58fa83f8ef4c","Type":"ContainerStarted","Data":"1cac439740d7084ca5f7aa15fd6cdc83fd1c006bb0d68ec0a9021809cd4fd0b1"} Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.763410 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-48jd6" event={"ID":"6b7ad647-b9dc-4694-beab-5908b529d9cf","Type":"ContainerStarted","Data":"5ea59b539f84c9a43893c21f3c6956f4fbb1ff0df7f0f95c12c75474f9544f0b"} Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.764911 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" event={"ID":"f3f0dcdd-283d-4ed5-889a-da260dcf13b0","Type":"ContainerStarted","Data":"41a6dd3179fbf9f69b95fcd8c10f11a3a08fec1e76ddb4a9ea951c5e06ba3c8d"} Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.766133 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" event={"ID":"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68","Type":"ContainerStarted","Data":"e75290bf5912b9330d704a9ecc30324420ba000c384cfc00982f22f7acb188fa"} Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.767261 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ntjk6" event={"ID":"2477e356-dc04-44a6-bec0-e7304134493f","Type":"ContainerStarted","Data":"ea1b3a5edcdbf3232f9d8dc6ca46f6a98e3218a5b912b906831ecc19fb8f705e"} Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.767284 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-ntjk6" event={"ID":"2477e356-dc04-44a6-bec0-e7304134493f","Type":"ContainerStarted","Data":"f28a06becba23de0e3f522c5c6cc905bfda4bb6bcce638e97ad4eedc1d44194c"} Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.772502 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.789918 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.809020 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.829583 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.849386 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.868380 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.888585 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.909286 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.929496 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.949452 4752 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.969925 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Sep 29 10:46:41 crc kubenswrapper[4752]: I0929 10:46:41.990184 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.009733 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.030177 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.051917 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.067361 4752 request.go:700] Waited for 1.918289573s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/secrets?fieldSelector=metadata.name%3Dcsi-hostpath-provisioner-sa-dockercfg-qd74k&limit=500&resourceVersion=0 Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.069822 4752 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.089422 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.156090 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82b648a1-4d55-4994-b471-4938eeb34bd0-etcd-client\") pod 
\"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.156192 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-audit-policies\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.156215 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.156254 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6445acca-bd3a-41a4-8b4a-16607771e077-config\") pod \"console-operator-58897d9998-hbftw\" (UID: \"6445acca-bd3a-41a4-8b4a-16607771e077\") " pod="openshift-console-operator/console-operator-58897d9998-hbftw" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.156282 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/82b648a1-4d55-4994-b471-4938eeb34bd0-encryption-config\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.156301 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.156322 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/981034b2-30d0-4fea-933d-ef36b1b20d25-metrics-tls\") pod \"ingress-operator-5b745b69d9-z4kqj\" (UID: \"981034b2-30d0-4fea-933d-ef36b1b20d25\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.156426 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/981034b2-30d0-4fea-933d-ef36b1b20d25-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z4kqj\" (UID: \"981034b2-30d0-4fea-933d-ef36b1b20d25\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.156513 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brb8j\" (UniqueName: \"kubernetes.io/projected/50f1884b-af0d-429a-b38e-ffd726a9463b-kube-api-access-brb8j\") pod \"kube-storage-version-migrator-operator-b67b599dd-mfldp\" (UID: \"50f1884b-af0d-429a-b38e-ffd726a9463b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mfldp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.156686 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-service-ca\") pod \"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.156717 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.156756 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ee4b36a-3e8c-41f7-9457-7928047cd0c6-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-s6ksk\" (UID: \"1ee4b36a-3e8c-41f7-9457-7928047cd0c6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s6ksk" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.156793 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f756d24-5e77-4130-b920-794234a82ece-bound-sa-token\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.156863 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk6x6\" (UniqueName: \"kubernetes.io/projected/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-kube-api-access-hk6x6\") pod \"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " pod="openshift-console/console-f9d7485db-bp6hz" 
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.156887 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.156922 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnlvs\" (UniqueName: \"kubernetes.io/projected/1ee4b36a-3e8c-41f7-9457-7928047cd0c6-kube-api-access-rnlvs\") pod \"openshift-controller-manager-operator-756b6f6bc6-s6ksk\" (UID: \"1ee4b36a-3e8c-41f7-9457-7928047cd0c6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s6ksk" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.156962 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f94d0b66-3c2d-46f5-bcdb-078cbb7cccae-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-krscp\" (UID: \"f94d0b66-3c2d-46f5-bcdb-078cbb7cccae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-krscp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.157069 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50f1884b-af0d-429a-b38e-ffd726a9463b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mfldp\" (UID: \"50f1884b-af0d-429a-b38e-ffd726a9463b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mfldp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.157251 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d-service-ca-bundle\") pod \"router-default-5444994796-r8k8r\" (UID: \"d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d\") " pod="openshift-ingress/router-default-5444994796-r8k8r" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.157298 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d-stats-auth\") pod \"router-default-5444994796-r8k8r\" (UID: \"d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d\") " pod="openshift-ingress/router-default-5444994796-r8k8r" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.157334 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a002dc-c902-472c-b269-9ec7c99ab835-serving-cert\") pod \"route-controller-manager-6576b87f9c-q8xbr\" (UID: \"78a002dc-c902-472c-b269-9ec7c99ab835\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.157517 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82b648a1-4d55-4994-b471-4938eeb34bd0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.157558 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: 
\"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.157608 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.157722 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tddv6\" (UniqueName: \"kubernetes.io/projected/44ae2b29-ec3a-4321-8590-4d316d810034-kube-api-access-tddv6\") pod \"downloads-7954f5f757-bhw29\" (UID: \"44ae2b29-ec3a-4321-8590-4d316d810034\") " pod="openshift-console/downloads-7954f5f757-bhw29" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.157822 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-console-serving-cert\") pod \"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.157847 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-console-config\") pod \"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.157876 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/fc5ce698-c6c1-41fc-9b25-864081396f26-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bq279\" (UID: \"fc5ce698-c6c1-41fc-9b25-864081396f26\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bq279" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.157904 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzvrr\" (UniqueName: \"kubernetes.io/projected/d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d-kube-api-access-pzvrr\") pod \"router-default-5444994796-r8k8r\" (UID: \"d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d\") " pod="openshift-ingress/router-default-5444994796-r8k8r" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.157955 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158021 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82b648a1-4d55-4994-b471-4938eeb34bd0-audit-dir\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158085 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-console-oauth-config\") pod \"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " 
pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158114 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6445acca-bd3a-41a4-8b4a-16607771e077-serving-cert\") pod \"console-operator-58897d9998-hbftw\" (UID: \"6445acca-bd3a-41a4-8b4a-16607771e077\") " pod="openshift-console-operator/console-operator-58897d9998-hbftw" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158141 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdmbs\" (UniqueName: \"kubernetes.io/projected/8f756d24-5e77-4130-b920-794234a82ece-kube-api-access-gdmbs\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158168 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f756d24-5e77-4130-b920-794234a82ece-installation-pull-secrets\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158195 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82b648a1-4d55-4994-b471-4938eeb34bd0-serving-cert\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158217 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fc5ce698-c6c1-41fc-9b25-864081396f26-config\") pod \"kube-controller-manager-operator-78b949d7b-bq279\" (UID: \"fc5ce698-c6c1-41fc-9b25-864081396f26\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bq279" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158244 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a002dc-c902-472c-b269-9ec7c99ab835-config\") pod \"route-controller-manager-6576b87f9c-q8xbr\" (UID: \"78a002dc-c902-472c-b269-9ec7c99ab835\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158275 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c234e889-6259-4ada-827f-532882f57c4c-service-ca-bundle\") pod \"authentication-operator-69f744f599-dbt29\" (UID: \"c234e889-6259-4ada-827f-532882f57c4c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dbt29" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158324 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwmb6\" (UniqueName: \"kubernetes.io/projected/78a002dc-c902-472c-b269-9ec7c99ab835-kube-api-access-wwmb6\") pod \"route-controller-manager-6576b87f9c-q8xbr\" (UID: \"78a002dc-c902-472c-b269-9ec7c99ab835\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158351 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/981034b2-30d0-4fea-933d-ef36b1b20d25-trusted-ca\") pod \"ingress-operator-5b745b69d9-z4kqj\" (UID: 
\"981034b2-30d0-4fea-933d-ef36b1b20d25\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158381 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f756d24-5e77-4130-b920-794234a82ece-registry-tls\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158405 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ce060b9-be39-4731-a723-388817737e31-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-whxr2\" (UID: \"9ce060b9-be39-4731-a723-388817737e31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-whxr2" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158436 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d-metrics-certs\") pod \"router-default-5444994796-r8k8r\" (UID: \"d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d\") " pod="openshift-ingress/router-default-5444994796-r8k8r" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158463 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkk2f\" (UniqueName: \"kubernetes.io/projected/24ab4270-1ece-4201-94ae-51c71902c3f1-kube-api-access-nkk2f\") pod \"control-plane-machine-set-operator-78cbb6b69f-s8pf8\" (UID: \"24ab4270-1ece-4201-94ae-51c71902c3f1\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s8pf8" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158493 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c234e889-6259-4ada-827f-532882f57c4c-serving-cert\") pod \"authentication-operator-69f744f599-dbt29\" (UID: \"c234e889-6259-4ada-827f-532882f57c4c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dbt29" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158524 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158556 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f756d24-5e77-4130-b920-794234a82ece-ca-trust-extracted\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158581 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158609 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l76nd\" (UniqueName: 
\"kubernetes.io/projected/981034b2-30d0-4fea-933d-ef36b1b20d25-kube-api-access-l76nd\") pod \"ingress-operator-5b745b69d9-z4kqj\" (UID: \"981034b2-30d0-4fea-933d-ef36b1b20d25\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158758 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc5ce698-c6c1-41fc-9b25-864081396f26-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bq279\" (UID: \"fc5ce698-c6c1-41fc-9b25-864081396f26\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bq279" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158786 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78a002dc-c902-472c-b269-9ec7c99ab835-client-ca\") pod \"route-controller-manager-6576b87f9c-q8xbr\" (UID: \"78a002dc-c902-472c-b269-9ec7c99ab835\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158854 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ee4b36a-3e8c-41f7-9457-7928047cd0c6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-s6ksk\" (UID: \"1ee4b36a-3e8c-41f7-9457-7928047cd0c6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s6ksk" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158881 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50f1884b-af0d-429a-b38e-ffd726a9463b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mfldp\" (UID: 
\"50f1884b-af0d-429a-b38e-ffd726a9463b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mfldp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.158945 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f756d24-5e77-4130-b920-794234a82ece-trusted-ca\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159012 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-oauth-serving-cert\") pod \"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159061 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6445acca-bd3a-41a4-8b4a-16607771e077-trusted-ca\") pod \"console-operator-58897d9998-hbftw\" (UID: \"6445acca-bd3a-41a4-8b4a-16607771e077\") " pod="openshift-console-operator/console-operator-58897d9998-hbftw" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159091 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ce060b9-be39-4731-a723-388817737e31-config\") pod \"kube-apiserver-operator-766d6c64bb-whxr2\" (UID: \"9ce060b9-be39-4731-a723-388817737e31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-whxr2" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159126 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82b648a1-4d55-4994-b471-4938eeb34bd0-audit-policies\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159146 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ce060b9-be39-4731-a723-388817737e31-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-whxr2\" (UID: \"9ce060b9-be39-4731-a723-388817737e31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-whxr2" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159169 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c234e889-6259-4ada-827f-532882f57c4c-config\") pod \"authentication-operator-69f744f599-dbt29\" (UID: \"c234e889-6259-4ada-827f-532882f57c4c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dbt29" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159190 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c234e889-6259-4ada-827f-532882f57c4c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dbt29\" (UID: \"c234e889-6259-4ada-827f-532882f57c4c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dbt29" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159250 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: 
\"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159277 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/033ab5f2-0ffb-4f26-b703-8380971d1d7e-serving-cert\") pod \"openshift-config-operator-7777fb866f-h5ql5\" (UID: \"033ab5f2-0ffb-4f26-b703-8380971d1d7e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159301 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhhxv\" (UniqueName: \"kubernetes.io/projected/033ab5f2-0ffb-4f26-b703-8380971d1d7e-kube-api-access-xhhxv\") pod \"openshift-config-operator-7777fb866f-h5ql5\" (UID: \"033ab5f2-0ffb-4f26-b703-8380971d1d7e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159324 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-trusted-ca-bundle\") pod \"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159372 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f00d489f-9b4a-421a-a02a-b9b090ae0449-audit-dir\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159449 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/82b648a1-4d55-4994-b471-4938eeb34bd0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159476 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59kq2\" (UniqueName: \"kubernetes.io/projected/c234e889-6259-4ada-827f-532882f57c4c-kube-api-access-59kq2\") pod \"authentication-operator-69f744f599-dbt29\" (UID: \"c234e889-6259-4ada-827f-532882f57c4c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dbt29" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159585 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/033ab5f2-0ffb-4f26-b703-8380971d1d7e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-h5ql5\" (UID: \"033ab5f2-0ffb-4f26-b703-8380971d1d7e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159614 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrgs6\" (UniqueName: \"kubernetes.io/projected/f00d489f-9b4a-421a-a02a-b9b090ae0449-kube-api-access-qrgs6\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159639 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/24ab4270-1ece-4201-94ae-51c71902c3f1-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-s8pf8\" (UID: \"24ab4270-1ece-4201-94ae-51c71902c3f1\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s8pf8" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159720 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv6kt\" (UniqueName: \"kubernetes.io/projected/6445acca-bd3a-41a4-8b4a-16607771e077-kube-api-access-qv6kt\") pod \"console-operator-58897d9998-hbftw\" (UID: \"6445acca-bd3a-41a4-8b4a-16607771e077\") " pod="openshift-console-operator/console-operator-58897d9998-hbftw" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159759 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s4nh\" (UniqueName: \"kubernetes.io/projected/f94d0b66-3c2d-46f5-bcdb-078cbb7cccae-kube-api-access-4s4nh\") pod \"multus-admission-controller-857f4d67dd-krscp\" (UID: \"f94d0b66-3c2d-46f5-bcdb-078cbb7cccae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-krscp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159833 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk7nw\" (UniqueName: \"kubernetes.io/projected/82b648a1-4d55-4994-b471-4938eeb34bd0-kube-api-access-nk7nw\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159863 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d-default-certificate\") pod \"router-default-5444994796-r8k8r\" (UID: \"d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d\") " pod="openshift-ingress/router-default-5444994796-r8k8r" Sep 29 10:46:42 crc kubenswrapper[4752]: 
I0929 10:46:42.159918 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f756d24-5e77-4130-b920-794234a82ece-registry-certificates\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159961 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.159988 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: E0929 10:46:42.160039 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:42.659987835 +0000 UTC m=+143.449129742 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.261100 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:42 crc kubenswrapper[4752]: E0929 10:46:42.261300 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:42.761268521 +0000 UTC m=+143.550410188 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.261349 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf22s\" (UniqueName: \"kubernetes.io/projected/d04ea3ea-71ac-481c-990b-a989a6f61516-kube-api-access-gf22s\") pod \"marketplace-operator-79b997595-pbmfv\" (UID: \"d04ea3ea-71ac-481c-990b-a989a6f61516\") " pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.261390 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc5ce698-c6c1-41fc-9b25-864081396f26-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bq279\" (UID: \"fc5ce698-c6c1-41fc-9b25-864081396f26\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bq279" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.261410 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ee4b36a-3e8c-41f7-9457-7928047cd0c6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-s6ksk\" (UID: \"1ee4b36a-3e8c-41f7-9457-7928047cd0c6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s6ksk" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.261457 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/50f1884b-af0d-429a-b38e-ffd726a9463b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mfldp\" (UID: \"50f1884b-af0d-429a-b38e-ffd726a9463b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mfldp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.261485 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f756d24-5e77-4130-b920-794234a82ece-trusted-ca\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.261503 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-oauth-serving-cert\") pod \"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.261519 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6445acca-bd3a-41a4-8b4a-16607771e077-trusted-ca\") pod \"console-operator-58897d9998-hbftw\" (UID: \"6445acca-bd3a-41a4-8b4a-16607771e077\") " pod="openshift-console-operator/console-operator-58897d9998-hbftw" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.261534 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c234e889-6259-4ada-827f-532882f57c4c-config\") pod \"authentication-operator-69f744f599-dbt29\" (UID: \"c234e889-6259-4ada-827f-532882f57c4c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dbt29" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.261552 4752 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2914c017-5539-4c32-9b8a-c494dd5b397a-images\") pod \"machine-config-operator-74547568cd-hnqp5\" (UID: \"2914c017-5539-4c32-9b8a-c494dd5b397a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.261568 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d04ea3ea-71ac-481c-990b-a989a6f61516-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pbmfv\" (UID: \"d04ea3ea-71ac-481c-990b-a989a6f61516\") " pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.261586 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ce060b9-be39-4731-a723-388817737e31-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-whxr2\" (UID: \"9ce060b9-be39-4731-a723-388817737e31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-whxr2" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.261603 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhhxv\" (UniqueName: \"kubernetes.io/projected/033ab5f2-0ffb-4f26-b703-8380971d1d7e-kube-api-access-xhhxv\") pod \"openshift-config-operator-7777fb866f-h5ql5\" (UID: \"033ab5f2-0ffb-4f26-b703-8380971d1d7e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.261619 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f00d489f-9b4a-421a-a02a-b9b090ae0449-audit-dir\") pod 
\"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.261642 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.261659 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/033ab5f2-0ffb-4f26-b703-8380971d1d7e-serving-cert\") pod \"openshift-config-operator-7777fb866f-h5ql5\" (UID: \"033ab5f2-0ffb-4f26-b703-8380971d1d7e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.261684 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5eaef9fd-30e6-47e1-afbf-8c0a464a4512-proxy-tls\") pod \"machine-config-controller-84d6567774-974fr\" (UID: \"5eaef9fd-30e6-47e1-afbf-8c0a464a4512\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-974fr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.261731 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a2a2cd8-7738-47ce-9432-770f570fa47e-config-volume\") pod \"dns-default-jvk96\" (UID: \"9a2a2cd8-7738-47ce-9432-770f570fa47e\") " pod="openshift-dns/dns-default-jvk96" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.261885 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f00d489f-9b4a-421a-a02a-b9b090ae0449-audit-dir\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.262624 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ee4b36a-3e8c-41f7-9457-7928047cd0c6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-s6ksk\" (UID: \"1ee4b36a-3e8c-41f7-9457-7928047cd0c6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s6ksk" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.262657 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c234e889-6259-4ada-827f-532882f57c4c-config\") pod \"authentication-operator-69f744f599-dbt29\" (UID: \"c234e889-6259-4ada-827f-532882f57c4c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dbt29" Sep 29 10:46:42 crc kubenswrapper[4752]: E0929 10:46:42.262727 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:42.762706631 +0000 UTC m=+143.551848298 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.262753 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50f1884b-af0d-429a-b38e-ffd726a9463b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mfldp\" (UID: \"50f1884b-af0d-429a-b38e-ffd726a9463b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mfldp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.262773 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/033ab5f2-0ffb-4f26-b703-8380971d1d7e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-h5ql5\" (UID: \"033ab5f2-0ffb-4f26-b703-8380971d1d7e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.262812 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrgs6\" (UniqueName: \"kubernetes.io/projected/f00d489f-9b4a-421a-a02a-b9b090ae0449-kube-api-access-qrgs6\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.262833 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bmhr\" (UniqueName: 
\"kubernetes.io/projected/2c7183be-dcde-4752-8650-ffd4e9834ce5-kube-api-access-9bmhr\") pod \"catalog-operator-68c6474976-5c9h9\" (UID: \"2c7183be-dcde-4752-8650-ffd4e9834ce5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.262860 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f54680a-b95f-4074-a763-859a7e96962d-metrics-tls\") pod \"dns-operator-744455d44c-cxqx7\" (UID: \"0f54680a-b95f-4074-a763-859a7e96962d\") " pod="openshift-dns-operator/dns-operator-744455d44c-cxqx7" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.262921 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv6kt\" (UniqueName: \"kubernetes.io/projected/6445acca-bd3a-41a4-8b4a-16607771e077-kube-api-access-qv6kt\") pod \"console-operator-58897d9998-hbftw\" (UID: \"6445acca-bd3a-41a4-8b4a-16607771e077\") " pod="openshift-console-operator/console-operator-58897d9998-hbftw" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.262946 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s4nh\" (UniqueName: \"kubernetes.io/projected/f94d0b66-3c2d-46f5-bcdb-078cbb7cccae-kube-api-access-4s4nh\") pod \"multus-admission-controller-857f4d67dd-krscp\" (UID: \"f94d0b66-3c2d-46f5-bcdb-078cbb7cccae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-krscp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.262969 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d-default-certificate\") pod \"router-default-5444994796-r8k8r\" (UID: \"d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d\") " pod="openshift-ingress/router-default-5444994796-r8k8r" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 
10:46:42.262991 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f756d24-5e77-4130-b920-794234a82ece-registry-certificates\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263010 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263037 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263061 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6c5671cc-4e9d-423d-b0d6-ea9e86420210-signing-key\") pod \"service-ca-9c57cc56f-npmvp\" (UID: \"6c5671cc-4e9d-423d-b0d6-ea9e86420210\") " pod="openshift-service-ca/service-ca-9c57cc56f-npmvp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263104 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82b648a1-4d55-4994-b471-4938eeb34bd0-etcd-client\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263122 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6445acca-bd3a-41a4-8b4a-16607771e077-config\") pod \"console-operator-58897d9998-hbftw\" (UID: \"6445acca-bd3a-41a4-8b4a-16607771e077\") " pod="openshift-console-operator/console-operator-58897d9998-hbftw" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263146 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263165 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/981034b2-30d0-4fea-933d-ef36b1b20d25-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z4kqj\" (UID: \"981034b2-30d0-4fea-933d-ef36b1b20d25\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263171 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-oauth-serving-cert\") pod \"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263199 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-service-ca\") pod 
\"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263304 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263332 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ee4b36a-3e8c-41f7-9457-7928047cd0c6-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-s6ksk\" (UID: \"1ee4b36a-3e8c-41f7-9457-7928047cd0c6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s6ksk" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263358 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c6e8ca63-ed5f-4e08-8c2a-ac799993720c-apiservice-cert\") pod \"packageserver-d55dfcdfc-b7cd6\" (UID: \"c6e8ca63-ed5f-4e08-8c2a-ac799993720c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263404 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6445acca-bd3a-41a4-8b4a-16607771e077-trusted-ca\") pod \"console-operator-58897d9998-hbftw\" (UID: \"6445acca-bd3a-41a4-8b4a-16607771e077\") " pod="openshift-console-operator/console-operator-58897d9998-hbftw" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263419 4752 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f756d24-5e77-4130-b920-794234a82ece-bound-sa-token\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263518 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk6x6\" (UniqueName: \"kubernetes.io/projected/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-kube-api-access-hk6x6\") pod \"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263545 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnlvs\" (UniqueName: \"kubernetes.io/projected/1ee4b36a-3e8c-41f7-9457-7928047cd0c6-kube-api-access-rnlvs\") pod \"openshift-controller-manager-operator-756b6f6bc6-s6ksk\" (UID: \"1ee4b36a-3e8c-41f7-9457-7928047cd0c6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s6ksk" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263101 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/033ab5f2-0ffb-4f26-b703-8380971d1d7e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-h5ql5\" (UID: \"033ab5f2-0ffb-4f26-b703-8380971d1d7e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263574 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km27r\" (UniqueName: \"kubernetes.io/projected/40deae84-a862-4cbf-8acc-d03e9f516ff4-kube-api-access-km27r\") pod \"migrator-59844c95c7-6946k\" (UID: \"40deae84-a862-4cbf-8acc-d03e9f516ff4\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6946k" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263598 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a2cc4fff-1922-45ac-871c-bab6f753b026-mountpoint-dir\") pod \"csi-hostpathplugin-zr2lt\" (UID: \"a2cc4fff-1922-45ac-871c-bab6f753b026\") " pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263546 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f756d24-5e77-4130-b920-794234a82ece-trusted-ca\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263625 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f94d0b66-3c2d-46f5-bcdb-078cbb7cccae-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-krscp\" (UID: \"f94d0b66-3c2d-46f5-bcdb-078cbb7cccae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-krscp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263646 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50f1884b-af0d-429a-b38e-ffd726a9463b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mfldp\" (UID: \"50f1884b-af0d-429a-b38e-ffd726a9463b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mfldp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263684 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/9a2a2cd8-7738-47ce-9432-770f570fa47e-metrics-tls\") pod \"dns-default-jvk96\" (UID: \"9a2a2cd8-7738-47ce-9432-770f570fa47e\") " pod="openshift-dns/dns-default-jvk96" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263702 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263711 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263882 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-service-ca\") pod \"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263929 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263954 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9681ca7c-896e-41a2-8f2a-129b813f6695-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8vwzv\" (UID: \"9681ca7c-896e-41a2-8f2a-129b813f6695\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8vwzv" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.263990 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr2hm\" (UniqueName: \"kubernetes.io/projected/c6e8ca63-ed5f-4e08-8c2a-ac799993720c-kube-api-access-sr2hm\") pod \"packageserver-d55dfcdfc-b7cd6\" (UID: \"c6e8ca63-ed5f-4e08-8c2a-ac799993720c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.265036 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f756d24-5e77-4130-b920-794234a82ece-registry-certificates\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.265276 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6445acca-bd3a-41a4-8b4a-16607771e077-config\") pod \"console-operator-58897d9998-hbftw\" (UID: \"6445acca-bd3a-41a4-8b4a-16607771e077\") " pod="openshift-console-operator/console-operator-58897d9998-hbftw" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.271513 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-console-config\") pod \"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " 
pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.271780 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzvrr\" (UniqueName: \"kubernetes.io/projected/d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d-kube-api-access-pzvrr\") pod \"router-default-5444994796-r8k8r\" (UID: \"d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d\") " pod="openshift-ingress/router-default-5444994796-r8k8r" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.272916 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-console-config\") pod \"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.273211 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-console-serving-cert\") pod \"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.273353 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.273458 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ae78dd2a-9513-40d7-b38c-6ba848c3c558-etcd-client\") pod \"etcd-operator-b45778765-79rht\" (UID: 
\"ae78dd2a-9513-40d7-b38c-6ba848c3c558\") " pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.273586 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2914c017-5539-4c32-9b8a-c494dd5b397a-proxy-tls\") pod \"machine-config-operator-74547568cd-hnqp5\" (UID: \"2914c017-5539-4c32-9b8a-c494dd5b397a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.273682 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ae78dd2a-9513-40d7-b38c-6ba848c3c558-etcd-service-ca\") pod \"etcd-operator-b45778765-79rht\" (UID: \"ae78dd2a-9513-40d7-b38c-6ba848c3c558\") " pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.273835 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/bb3c5ed6-797d-4357-85a0-7520a82c5f06-node-bootstrap-token\") pod \"machine-config-server-bhv9s\" (UID: \"bb3c5ed6-797d-4357-85a0-7520a82c5f06\") " pod="openshift-machine-config-operator/machine-config-server-bhv9s" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.273948 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae78dd2a-9513-40d7-b38c-6ba848c3c558-config\") pod \"etcd-operator-b45778765-79rht\" (UID: \"ae78dd2a-9513-40d7-b38c-6ba848c3c558\") " pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.274075 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d04ea3ea-71ac-481c-990b-a989a6f61516-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pbmfv\" (UID: \"d04ea3ea-71ac-481c-990b-a989a6f61516\") " pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.274185 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6445acca-bd3a-41a4-8b4a-16607771e077-serving-cert\") pod \"console-operator-58897d9998-hbftw\" (UID: \"6445acca-bd3a-41a4-8b4a-16607771e077\") " pod="openshift-console-operator/console-operator-58897d9998-hbftw" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.274281 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2c7183be-dcde-4752-8650-ffd4e9834ce5-srv-cert\") pod \"catalog-operator-68c6474976-5c9h9\" (UID: \"2c7183be-dcde-4752-8650-ffd4e9834ce5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.274377 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c6rh\" (UniqueName: \"kubernetes.io/projected/4ad84a5d-2ceb-457c-af21-d15ac31a54ef-kube-api-access-4c6rh\") pod \"ingress-canary-fc286\" (UID: \"4ad84a5d-2ceb-457c-af21-d15ac31a54ef\") " pod="openshift-ingress-canary/ingress-canary-fc286" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.274533 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f756d24-5e77-4130-b920-794234a82ece-installation-pull-secrets\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc 
kubenswrapper[4752]: I0929 10:46:42.274623 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82b648a1-4d55-4994-b471-4938eeb34bd0-serving-cert\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.274710 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/bb3c5ed6-797d-4357-85a0-7520a82c5f06-certs\") pod \"machine-config-server-bhv9s\" (UID: \"bb3c5ed6-797d-4357-85a0-7520a82c5f06\") " pod="openshift-machine-config-operator/machine-config-server-bhv9s" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.274779 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a2cc4fff-1922-45ac-871c-bab6f753b026-plugins-dir\") pod \"csi-hostpathplugin-zr2lt\" (UID: \"a2cc4fff-1922-45ac-871c-bab6f753b026\") " pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.274890 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2914c017-5539-4c32-9b8a-c494dd5b397a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hnqp5\" (UID: \"2914c017-5539-4c32-9b8a-c494dd5b397a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.275001 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49c0607a-8e84-4961-9e37-45434deacc31-serving-cert\") pod \"service-ca-operator-777779d784-mw85q\" (UID: 
\"49c0607a-8e84-4961-9e37-45434deacc31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw85q" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.275087 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f756d24-5e77-4130-b920-794234a82ece-registry-tls\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.275167 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d-metrics-certs\") pod \"router-default-5444994796-r8k8r\" (UID: \"d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d\") " pod="openshift-ingress/router-default-5444994796-r8k8r" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.275263 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c234e889-6259-4ada-827f-532882f57c4c-serving-cert\") pod \"authentication-operator-69f744f599-dbt29\" (UID: \"c234e889-6259-4ada-827f-532882f57c4c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dbt29" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.275456 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6c5671cc-4e9d-423d-b0d6-ea9e86420210-signing-cabundle\") pod \"service-ca-9c57cc56f-npmvp\" (UID: \"6c5671cc-4e9d-423d-b0d6-ea9e86420210\") " pod="openshift-service-ca/service-ca-9c57cc56f-npmvp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.275584 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.275685 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/074633af-0fd3-4335-ba5e-af7840383694-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lwcrf\" (UID: \"074633af-0fd3-4335-ba5e-af7840383694\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lwcrf" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.275783 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f756d24-5e77-4130-b920-794234a82ece-ca-trust-extracted\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.275875 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l76nd\" (UniqueName: \"kubernetes.io/projected/981034b2-30d0-4fea-933d-ef36b1b20d25-kube-api-access-l76nd\") pod \"ingress-operator-5b745b69d9-z4kqj\" (UID: \"981034b2-30d0-4fea-933d-ef36b1b20d25\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.275963 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78a002dc-c902-472c-b269-9ec7c99ab835-client-ca\") pod \"route-controller-manager-6576b87f9c-q8xbr\" (UID: \"78a002dc-c902-472c-b269-9ec7c99ab835\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.276043 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-492xz\" (UniqueName: \"kubernetes.io/projected/2914c017-5539-4c32-9b8a-c494dd5b397a-kube-api-access-492xz\") pod \"machine-config-operator-74547568cd-hnqp5\" (UID: \"2914c017-5539-4c32-9b8a-c494dd5b397a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.276113 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ce060b9-be39-4731-a723-388817737e31-config\") pod \"kube-apiserver-operator-766d6c64bb-whxr2\" (UID: \"9ce060b9-be39-4731-a723-388817737e31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-whxr2" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.276180 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb8bh\" (UniqueName: \"kubernetes.io/projected/5eaef9fd-30e6-47e1-afbf-8c0a464a4512-kube-api-access-mb8bh\") pod \"machine-config-controller-84d6567774-974fr\" (UID: \"5eaef9fd-30e6-47e1-afbf-8c0a464a4512\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-974fr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.276255 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c6e8ca63-ed5f-4e08-8c2a-ac799993720c-webhook-cert\") pod \"packageserver-d55dfcdfc-b7cd6\" (UID: \"c6e8ca63-ed5f-4e08-8c2a-ac799993720c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.282116 4752 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.282162 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ee4b36a-3e8c-41f7-9457-7928047cd0c6-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-s6ksk\" (UID: \"1ee4b36a-3e8c-41f7-9457-7928047cd0c6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s6ksk" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.282753 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d-default-certificate\") pod \"router-default-5444994796-r8k8r\" (UID: \"d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d\") " pod="openshift-ingress/router-default-5444994796-r8k8r" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.282898 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f756d24-5e77-4130-b920-794234a82ece-ca-trust-extracted\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.283736 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ce060b9-be39-4731-a723-388817737e31-config\") pod \"kube-apiserver-operator-766d6c64bb-whxr2\" (UID: \"9ce060b9-be39-4731-a723-388817737e31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-whxr2" 
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.283962 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78a002dc-c902-472c-b269-9ec7c99ab835-client-ca\") pod \"route-controller-manager-6576b87f9c-q8xbr\" (UID: \"78a002dc-c902-472c-b269-9ec7c99ab835\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.284061 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82b648a1-4d55-4994-b471-4938eeb34bd0-audit-policies\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.284210 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c234e889-6259-4ada-827f-532882f57c4c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dbt29\" (UID: \"c234e889-6259-4ada-827f-532882f57c4c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dbt29" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.284239 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259601c2-169a-446f-9fb7-e93aefc143b4-srv-cert\") pod \"olm-operator-6b444d44fb-gvhpf\" (UID: \"259601c2-169a-446f-9fb7-e93aefc143b4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gvhpf" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.284554 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50f1884b-af0d-429a-b38e-ffd726a9463b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mfldp\" (UID: 
\"50f1884b-af0d-429a-b38e-ffd726a9463b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mfldp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.284723 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82b648a1-4d55-4994-b471-4938eeb34bd0-audit-policies\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.284853 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-trusted-ca-bundle\") pod \"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.284969 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/82b648a1-4d55-4994-b471-4938eeb34bd0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.285006 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59kq2\" (UniqueName: \"kubernetes.io/projected/c234e889-6259-4ada-827f-532882f57c4c-kube-api-access-59kq2\") pod \"authentication-operator-69f744f599-dbt29\" (UID: \"c234e889-6259-4ada-827f-532882f57c4c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dbt29" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.285068 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/c6e8ca63-ed5f-4e08-8c2a-ac799993720c-tmpfs\") pod \"packageserver-d55dfcdfc-b7cd6\" (UID: \"c6e8ca63-ed5f-4e08-8c2a-ac799993720c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.285588 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c234e889-6259-4ada-827f-532882f57c4c-serving-cert\") pod \"authentication-operator-69f744f599-dbt29\" (UID: \"c234e889-6259-4ada-827f-532882f57c4c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dbt29" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.285665 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.285996 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/033ab5f2-0ffb-4f26-b703-8380971d1d7e-serving-cert\") pod \"openshift-config-operator-7777fb866f-h5ql5\" (UID: \"033ab5f2-0ffb-4f26-b703-8380971d1d7e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.286119 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c234e889-6259-4ada-827f-532882f57c4c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dbt29\" (UID: \"c234e889-6259-4ada-827f-532882f57c4c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dbt29" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 
10:46:42.286148 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.286314 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-trusted-ca-bundle\") pod \"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.286364 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f94d0b66-3c2d-46f5-bcdb-078cbb7cccae-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-krscp\" (UID: \"f94d0b66-3c2d-46f5-bcdb-078cbb7cccae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-krscp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.286390 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/24ab4270-1ece-4201-94ae-51c71902c3f1-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-s8pf8\" (UID: \"24ab4270-1ece-4201-94ae-51c71902c3f1\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s8pf8" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.286425 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/074633af-0fd3-4335-ba5e-af7840383694-kube-api-access\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-lwcrf\" (UID: \"074633af-0fd3-4335-ba5e-af7840383694\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lwcrf" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.286472 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk7nw\" (UniqueName: \"kubernetes.io/projected/82b648a1-4d55-4994-b471-4938eeb34bd0-kube-api-access-nk7nw\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.286608 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.286893 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82b648a1-4d55-4994-b471-4938eeb34bd0-etcd-client\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.287022 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.287070 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/82b648a1-4d55-4994-b471-4938eeb34bd0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.287163 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbcwc\" (UniqueName: \"kubernetes.io/projected/ae8c092c-ec9d-456a-9ba3-5501c22f6280-kube-api-access-gbcwc\") pod \"collect-profiles-29319045-qpr28\" (UID: \"ae8c092c-ec9d-456a-9ba3-5501c22f6280\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.287333 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-audit-policies\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.287448 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.287897 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mjc4\" (UniqueName: \"kubernetes.io/projected/259601c2-169a-446f-9fb7-e93aefc143b4-kube-api-access-6mjc4\") pod \"olm-operator-6b444d44fb-gvhpf\" (UID: \"259601c2-169a-446f-9fb7-e93aefc143b4\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gvhpf" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.287962 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/82b648a1-4d55-4994-b471-4938eeb34bd0-encryption-config\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.288002 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhc8c\" (UniqueName: \"kubernetes.io/projected/a2cc4fff-1922-45ac-871c-bab6f753b026-kube-api-access-fhc8c\") pod \"csi-hostpathplugin-zr2lt\" (UID: \"a2cc4fff-1922-45ac-871c-bab6f753b026\") " pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.288004 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-audit-policies\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.288020 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ae8c092c-ec9d-456a-9ba3-5501c22f6280-secret-volume\") pod \"collect-profiles-29319045-qpr28\" (UID: \"ae8c092c-ec9d-456a-9ba3-5501c22f6280\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.288051 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/981034b2-30d0-4fea-933d-ef36b1b20d25-metrics-tls\") pod 
\"ingress-operator-5b745b69d9-z4kqj\" (UID: \"981034b2-30d0-4fea-933d-ef36b1b20d25\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.288087 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brb8j\" (UniqueName: \"kubernetes.io/projected/50f1884b-af0d-429a-b38e-ffd726a9463b-kube-api-access-brb8j\") pod \"kube-storage-version-migrator-operator-b67b599dd-mfldp\" (UID: \"50f1884b-af0d-429a-b38e-ffd726a9463b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mfldp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.288134 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4ad84a5d-2ceb-457c-af21-d15ac31a54ef-cert\") pod \"ingress-canary-fc286\" (UID: \"4ad84a5d-2ceb-457c-af21-d15ac31a54ef\") " pod="openshift-ingress-canary/ingress-canary-fc286" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.288160 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49c0607a-8e84-4961-9e37-45434deacc31-config\") pod \"service-ca-operator-777779d784-mw85q\" (UID: \"49c0607a-8e84-4961-9e37-45434deacc31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw85q" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.288203 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.288254 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d-service-ca-bundle\") pod \"router-default-5444994796-r8k8r\" (UID: \"d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d\") " pod="openshift-ingress/router-default-5444994796-r8k8r" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.288282 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d-stats-auth\") pod \"router-default-5444994796-r8k8r\" (UID: \"d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d\") " pod="openshift-ingress/router-default-5444994796-r8k8r" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.288309 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a002dc-c902-472c-b269-9ec7c99ab835-serving-cert\") pod \"route-controller-manager-6576b87f9c-q8xbr\" (UID: \"78a002dc-c902-472c-b269-9ec7c99ab835\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.288729 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.290096 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.290314 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.290539 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82b648a1-4d55-4994-b471-4938eeb34bd0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.291452 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82b648a1-4d55-4994-b471-4938eeb34bd0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.291701 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-console-serving-cert\") pod \"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.291820 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tddv6\" (UniqueName: \"kubernetes.io/projected/44ae2b29-ec3a-4321-8590-4d316d810034-kube-api-access-tddv6\") pod \"downloads-7954f5f757-bhw29\" (UID: 
\"44ae2b29-ec3a-4321-8590-4d316d810034\") " pod="openshift-console/downloads-7954f5f757-bhw29" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.291899 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc5ce698-c6c1-41fc-9b25-864081396f26-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bq279\" (UID: \"fc5ce698-c6c1-41fc-9b25-864081396f26\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bq279" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.291972 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/259601c2-169a-446f-9fb7-e93aefc143b4-profile-collector-cert\") pod \"olm-operator-6b444d44fb-gvhpf\" (UID: \"259601c2-169a-446f-9fb7-e93aefc143b4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gvhpf" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292010 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a2cc4fff-1922-45ac-871c-bab6f753b026-socket-dir\") pod \"csi-hostpathplugin-zr2lt\" (UID: \"a2cc4fff-1922-45ac-871c-bab6f753b026\") " pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292042 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82b648a1-4d55-4994-b471-4938eeb34bd0-audit-dir\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292116 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/5eaef9fd-30e6-47e1-afbf-8c0a464a4512-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-974fr\" (UID: \"5eaef9fd-30e6-47e1-afbf-8c0a464a4512\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-974fr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292157 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-console-oauth-config\") pod \"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292223 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdmbs\" (UniqueName: \"kubernetes.io/projected/8f756d24-5e77-4130-b920-794234a82ece-kube-api-access-gdmbs\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292257 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b26v7\" (UniqueName: \"kubernetes.io/projected/bb3c5ed6-797d-4357-85a0-7520a82c5f06-kube-api-access-b26v7\") pod \"machine-config-server-bhv9s\" (UID: \"bb3c5ed6-797d-4357-85a0-7520a82c5f06\") " pod="openshift-machine-config-operator/machine-config-server-bhv9s" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292281 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wtkn\" (UniqueName: \"kubernetes.io/projected/6c5671cc-4e9d-423d-b0d6-ea9e86420210-kube-api-access-4wtkn\") pod \"service-ca-9c57cc56f-npmvp\" (UID: \"6c5671cc-4e9d-423d-b0d6-ea9e86420210\") " pod="openshift-service-ca/service-ca-9c57cc56f-npmvp" Sep 29 10:46:42 crc 
kubenswrapper[4752]: I0929 10:46:42.292307 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc5ce698-c6c1-41fc-9b25-864081396f26-config\") pod \"kube-controller-manager-operator-78b949d7b-bq279\" (UID: \"fc5ce698-c6c1-41fc-9b25-864081396f26\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bq279" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292380 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rxcj\" (UniqueName: \"kubernetes.io/projected/9a2a2cd8-7738-47ce-9432-770f570fa47e-kube-api-access-2rxcj\") pod \"dns-default-jvk96\" (UID: \"9a2a2cd8-7738-47ce-9432-770f570fa47e\") " pod="openshift-dns/dns-default-jvk96" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292388 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f756d24-5e77-4130-b920-794234a82ece-registry-tls\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292414 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a002dc-c902-472c-b269-9ec7c99ab835-config\") pod \"route-controller-manager-6576b87f9c-q8xbr\" (UID: \"78a002dc-c902-472c-b269-9ec7c99ab835\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292443 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/074633af-0fd3-4335-ba5e-af7840383694-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lwcrf\" (UID: 
\"074633af-0fd3-4335-ba5e-af7840383694\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lwcrf" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292471 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c234e889-6259-4ada-827f-532882f57c4c-service-ca-bundle\") pod \"authentication-operator-69f744f599-dbt29\" (UID: \"c234e889-6259-4ada-827f-532882f57c4c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dbt29" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292494 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae78dd2a-9513-40d7-b38c-6ba848c3c558-serving-cert\") pod \"etcd-operator-b45778765-79rht\" (UID: \"ae78dd2a-9513-40d7-b38c-6ba848c3c558\") " pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292515 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ae78dd2a-9513-40d7-b38c-6ba848c3c558-etcd-ca\") pod \"etcd-operator-b45778765-79rht\" (UID: \"ae78dd2a-9513-40d7-b38c-6ba848c3c558\") " pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292539 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/981034b2-30d0-4fea-933d-ef36b1b20d25-trusted-ca\") pod \"ingress-operator-5b745b69d9-z4kqj\" (UID: \"981034b2-30d0-4fea-933d-ef36b1b20d25\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292561 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6v4bh\" (UniqueName: \"kubernetes.io/projected/9681ca7c-896e-41a2-8f2a-129b813f6695-kube-api-access-6v4bh\") pod \"package-server-manager-789f6589d5-8vwzv\" (UID: \"9681ca7c-896e-41a2-8f2a-129b813f6695\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8vwzv" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292663 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwmb6\" (UniqueName: \"kubernetes.io/projected/78a002dc-c902-472c-b269-9ec7c99ab835-kube-api-access-wwmb6\") pod \"route-controller-manager-6576b87f9c-q8xbr\" (UID: \"78a002dc-c902-472c-b269-9ec7c99ab835\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292689 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkk2f\" (UniqueName: \"kubernetes.io/projected/24ab4270-1ece-4201-94ae-51c71902c3f1-kube-api-access-nkk2f\") pod \"control-plane-machine-set-operator-78cbb6b69f-s8pf8\" (UID: \"24ab4270-1ece-4201-94ae-51c71902c3f1\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s8pf8" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292714 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a2cc4fff-1922-45ac-871c-bab6f753b026-registration-dir\") pod \"csi-hostpathplugin-zr2lt\" (UID: \"a2cc4fff-1922-45ac-871c-bab6f753b026\") " pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292776 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ce060b9-be39-4731-a723-388817737e31-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-whxr2\" (UID: \"9ce060b9-be39-4731-a723-388817737e31\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-whxr2" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292847 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k948\" (UniqueName: \"kubernetes.io/projected/49c0607a-8e84-4961-9e37-45434deacc31-kube-api-access-6k948\") pod \"service-ca-operator-777779d784-mw85q\" (UID: \"49c0607a-8e84-4961-9e37-45434deacc31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw85q" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292871 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae8c092c-ec9d-456a-9ba3-5501c22f6280-config-volume\") pod \"collect-profiles-29319045-qpr28\" (UID: \"ae8c092c-ec9d-456a-9ba3-5501c22f6280\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292895 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jgtb\" (UniqueName: \"kubernetes.io/projected/ae78dd2a-9513-40d7-b38c-6ba848c3c558-kube-api-access-4jgtb\") pod \"etcd-operator-b45778765-79rht\" (UID: \"ae78dd2a-9513-40d7-b38c-6ba848c3c558\") " pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292917 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2c7183be-dcde-4752-8650-ffd4e9834ce5-profile-collector-cert\") pod \"catalog-operator-68c6474976-5c9h9\" (UID: \"2c7183be-dcde-4752-8650-ffd4e9834ce5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292941 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pl84\" (UniqueName: \"kubernetes.io/projected/0f54680a-b95f-4074-a763-859a7e96962d-kube-api-access-9pl84\") pod \"dns-operator-744455d44c-cxqx7\" (UID: \"0f54680a-b95f-4074-a763-859a7e96962d\") " pod="openshift-dns-operator/dns-operator-744455d44c-cxqx7" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292973 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.292994 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a2cc4fff-1922-45ac-871c-bab6f753b026-csi-data-dir\") pod \"csi-hostpathplugin-zr2lt\" (UID: \"a2cc4fff-1922-45ac-871c-bab6f753b026\") " pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.293195 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6445acca-bd3a-41a4-8b4a-16607771e077-serving-cert\") pod \"console-operator-58897d9998-hbftw\" (UID: \"6445acca-bd3a-41a4-8b4a-16607771e077\") " pod="openshift-console-operator/console-operator-58897d9998-hbftw" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.293466 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d-service-ca-bundle\") pod \"router-default-5444994796-r8k8r\" (UID: \"d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d\") " pod="openshift-ingress/router-default-5444994796-r8k8r" Sep 29 
10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.293814 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82b648a1-4d55-4994-b471-4938eeb34bd0-serving-cert\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.294266 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc5ce698-c6c1-41fc-9b25-864081396f26-config\") pod \"kube-controller-manager-operator-78b949d7b-bq279\" (UID: \"fc5ce698-c6c1-41fc-9b25-864081396f26\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bq279" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.294924 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f756d24-5e77-4130-b920-794234a82ece-installation-pull-secrets\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.295144 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.295231 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/981034b2-30d0-4fea-933d-ef36b1b20d25-trusted-ca\") pod \"ingress-operator-5b745b69d9-z4kqj\" (UID: 
\"981034b2-30d0-4fea-933d-ef36b1b20d25\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.295270 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82b648a1-4d55-4994-b471-4938eeb34bd0-audit-dir\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.296307 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/24ab4270-1ece-4201-94ae-51c71902c3f1-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-s8pf8\" (UID: \"24ab4270-1ece-4201-94ae-51c71902c3f1\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s8pf8" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.296765 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d-stats-auth\") pod \"router-default-5444994796-r8k8r\" (UID: \"d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d\") " pod="openshift-ingress/router-default-5444994796-r8k8r" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.296929 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c234e889-6259-4ada-827f-532882f57c4c-service-ca-bundle\") pod \"authentication-operator-69f744f599-dbt29\" (UID: \"c234e889-6259-4ada-827f-532882f57c4c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dbt29" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.296935 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/78a002dc-c902-472c-b269-9ec7c99ab835-config\") pod \"route-controller-manager-6576b87f9c-q8xbr\" (UID: \"78a002dc-c902-472c-b269-9ec7c99ab835\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.297786 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.298380 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ce060b9-be39-4731-a723-388817737e31-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-whxr2\" (UID: \"9ce060b9-be39-4731-a723-388817737e31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-whxr2" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.298989 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/981034b2-30d0-4fea-933d-ef36b1b20d25-metrics-tls\") pod \"ingress-operator-5b745b69d9-z4kqj\" (UID: \"981034b2-30d0-4fea-933d-ef36b1b20d25\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.299771 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a002dc-c902-472c-b269-9ec7c99ab835-serving-cert\") pod \"route-controller-manager-6576b87f9c-q8xbr\" (UID: \"78a002dc-c902-472c-b269-9ec7c99ab835\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 
10:46:42.300575 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/82b648a1-4d55-4994-b471-4938eeb34bd0-encryption-config\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.300920 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-console-oauth-config\") pod \"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.301164 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d-metrics-certs\") pod \"router-default-5444994796-r8k8r\" (UID: \"d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d\") " pod="openshift-ingress/router-default-5444994796-r8k8r" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.301574 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc5ce698-c6c1-41fc-9b25-864081396f26-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bq279\" (UID: \"fc5ce698-c6c1-41fc-9b25-864081396f26\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bq279" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.302240 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ce060b9-be39-4731-a723-388817737e31-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-whxr2\" (UID: \"9ce060b9-be39-4731-a723-388817737e31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-whxr2" Sep 
29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.311243 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhhxv\" (UniqueName: \"kubernetes.io/projected/033ab5f2-0ffb-4f26-b703-8380971d1d7e-kube-api-access-xhhxv\") pod \"openshift-config-operator-7777fb866f-h5ql5\" (UID: \"033ab5f2-0ffb-4f26-b703-8380971d1d7e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.331356 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc5ce698-c6c1-41fc-9b25-864081396f26-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bq279\" (UID: \"fc5ce698-c6c1-41fc-9b25-864081396f26\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bq279" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.333280 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-whxr2" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.342515 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrgs6\" (UniqueName: \"kubernetes.io/projected/f00d489f-9b4a-421a-a02a-b9b090ae0449-kube-api-access-qrgs6\") pod \"oauth-openshift-558db77b4-prpgr\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.365670 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv6kt\" (UniqueName: \"kubernetes.io/projected/6445acca-bd3a-41a4-8b4a-16607771e077-kube-api-access-qv6kt\") pod \"console-operator-58897d9998-hbftw\" (UID: \"6445acca-bd3a-41a4-8b4a-16607771e077\") " pod="openshift-console-operator/console-operator-58897d9998-hbftw" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.385105 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s4nh\" (UniqueName: \"kubernetes.io/projected/f94d0b66-3c2d-46f5-bcdb-078cbb7cccae-kube-api-access-4s4nh\") pod \"multus-admission-controller-857f4d67dd-krscp\" (UID: \"f94d0b66-3c2d-46f5-bcdb-078cbb7cccae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-krscp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.397344 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.397581 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/4ad84a5d-2ceb-457c-af21-d15ac31a54ef-cert\") pod \"ingress-canary-fc286\" (UID: \"4ad84a5d-2ceb-457c-af21-d15ac31a54ef\") " pod="openshift-ingress-canary/ingress-canary-fc286" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.397612 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhc8c\" (UniqueName: \"kubernetes.io/projected/a2cc4fff-1922-45ac-871c-bab6f753b026-kube-api-access-fhc8c\") pod \"csi-hostpathplugin-zr2lt\" (UID: \"a2cc4fff-1922-45ac-871c-bab6f753b026\") " pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.397635 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ae8c092c-ec9d-456a-9ba3-5501c22f6280-secret-volume\") pod \"collect-profiles-29319045-qpr28\" (UID: \"ae8c092c-ec9d-456a-9ba3-5501c22f6280\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.397656 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49c0607a-8e84-4961-9e37-45434deacc31-config\") pod \"service-ca-operator-777779d784-mw85q\" (UID: \"49c0607a-8e84-4961-9e37-45434deacc31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw85q" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.397691 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/259601c2-169a-446f-9fb7-e93aefc143b4-profile-collector-cert\") pod \"olm-operator-6b444d44fb-gvhpf\" (UID: \"259601c2-169a-446f-9fb7-e93aefc143b4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gvhpf" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.397711 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a2cc4fff-1922-45ac-871c-bab6f753b026-socket-dir\") pod \"csi-hostpathplugin-zr2lt\" (UID: \"a2cc4fff-1922-45ac-871c-bab6f753b026\") " pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.397735 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5eaef9fd-30e6-47e1-afbf-8c0a464a4512-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-974fr\" (UID: \"5eaef9fd-30e6-47e1-afbf-8c0a464a4512\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-974fr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.397794 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b26v7\" (UniqueName: \"kubernetes.io/projected/bb3c5ed6-797d-4357-85a0-7520a82c5f06-kube-api-access-b26v7\") pod \"machine-config-server-bhv9s\" (UID: \"bb3c5ed6-797d-4357-85a0-7520a82c5f06\") " pod="openshift-machine-config-operator/machine-config-server-bhv9s" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.397843 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wtkn\" (UniqueName: \"kubernetes.io/projected/6c5671cc-4e9d-423d-b0d6-ea9e86420210-kube-api-access-4wtkn\") pod \"service-ca-9c57cc56f-npmvp\" (UID: \"6c5671cc-4e9d-423d-b0d6-ea9e86420210\") " pod="openshift-service-ca/service-ca-9c57cc56f-npmvp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.397924 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rxcj\" (UniqueName: \"kubernetes.io/projected/9a2a2cd8-7738-47ce-9432-770f570fa47e-kube-api-access-2rxcj\") pod \"dns-default-jvk96\" (UID: \"9a2a2cd8-7738-47ce-9432-770f570fa47e\") " pod="openshift-dns/dns-default-jvk96" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.397950 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/074633af-0fd3-4335-ba5e-af7840383694-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lwcrf\" (UID: \"074633af-0fd3-4335-ba5e-af7840383694\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lwcrf" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.397973 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae78dd2a-9513-40d7-b38c-6ba848c3c558-serving-cert\") pod \"etcd-operator-b45778765-79rht\" (UID: \"ae78dd2a-9513-40d7-b38c-6ba848c3c558\") " pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.397991 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ae78dd2a-9513-40d7-b38c-6ba848c3c558-etcd-ca\") pod \"etcd-operator-b45778765-79rht\" (UID: \"ae78dd2a-9513-40d7-b38c-6ba848c3c558\") " pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.398053 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v4bh\" (UniqueName: \"kubernetes.io/projected/9681ca7c-896e-41a2-8f2a-129b813f6695-kube-api-access-6v4bh\") pod \"package-server-manager-789f6589d5-8vwzv\" (UID: \"9681ca7c-896e-41a2-8f2a-129b813f6695\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8vwzv" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.398103 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a2cc4fff-1922-45ac-871c-bab6f753b026-registration-dir\") pod \"csi-hostpathplugin-zr2lt\" (UID: \"a2cc4fff-1922-45ac-871c-bab6f753b026\") " 
pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.398131 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jgtb\" (UniqueName: \"kubernetes.io/projected/ae78dd2a-9513-40d7-b38c-6ba848c3c558-kube-api-access-4jgtb\") pod \"etcd-operator-b45778765-79rht\" (UID: \"ae78dd2a-9513-40d7-b38c-6ba848c3c558\") " pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.398154 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2c7183be-dcde-4752-8650-ffd4e9834ce5-profile-collector-cert\") pod \"catalog-operator-68c6474976-5c9h9\" (UID: \"2c7183be-dcde-4752-8650-ffd4e9834ce5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.398176 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pl84\" (UniqueName: \"kubernetes.io/projected/0f54680a-b95f-4074-a763-859a7e96962d-kube-api-access-9pl84\") pod \"dns-operator-744455d44c-cxqx7\" (UID: \"0f54680a-b95f-4074-a763-859a7e96962d\") " pod="openshift-dns-operator/dns-operator-744455d44c-cxqx7" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.398194 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k948\" (UniqueName: \"kubernetes.io/projected/49c0607a-8e84-4961-9e37-45434deacc31-kube-api-access-6k948\") pod \"service-ca-operator-777779d784-mw85q\" (UID: \"49c0607a-8e84-4961-9e37-45434deacc31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw85q" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.398214 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/ae8c092c-ec9d-456a-9ba3-5501c22f6280-config-volume\") pod \"collect-profiles-29319045-qpr28\" (UID: \"ae8c092c-ec9d-456a-9ba3-5501c22f6280\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.398237 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a2cc4fff-1922-45ac-871c-bab6f753b026-csi-data-dir\") pod \"csi-hostpathplugin-zr2lt\" (UID: \"a2cc4fff-1922-45ac-871c-bab6f753b026\") " pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.398262 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf22s\" (UniqueName: \"kubernetes.io/projected/d04ea3ea-71ac-481c-990b-a989a6f61516-kube-api-access-gf22s\") pod \"marketplace-operator-79b997595-pbmfv\" (UID: \"d04ea3ea-71ac-481c-990b-a989a6f61516\") " pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.398286 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2914c017-5539-4c32-9b8a-c494dd5b397a-images\") pod \"machine-config-operator-74547568cd-hnqp5\" (UID: \"2914c017-5539-4c32-9b8a-c494dd5b397a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.398313 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d04ea3ea-71ac-481c-990b-a989a6f61516-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pbmfv\" (UID: \"d04ea3ea-71ac-481c-990b-a989a6f61516\") " pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.398326 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a2cc4fff-1922-45ac-871c-bab6f753b026-socket-dir\") pod \"csi-hostpathplugin-zr2lt\" (UID: \"a2cc4fff-1922-45ac-871c-bab6f753b026\") " pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.398349 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5eaef9fd-30e6-47e1-afbf-8c0a464a4512-proxy-tls\") pod \"machine-config-controller-84d6567774-974fr\" (UID: \"5eaef9fd-30e6-47e1-afbf-8c0a464a4512\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-974fr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.398635 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49c0607a-8e84-4961-9e37-45434deacc31-config\") pod \"service-ca-operator-777779d784-mw85q\" (UID: \"49c0607a-8e84-4961-9e37-45434deacc31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw85q" Sep 29 10:46:42 crc kubenswrapper[4752]: E0929 10:46:42.399033 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:42.898984032 +0000 UTC m=+143.688125699 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.399394 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5eaef9fd-30e6-47e1-afbf-8c0a464a4512-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-974fr\" (UID: \"5eaef9fd-30e6-47e1-afbf-8c0a464a4512\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-974fr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.399572 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a2a2cd8-7738-47ce-9432-770f570fa47e-config-volume\") pod \"dns-default-jvk96\" (UID: \"9a2a2cd8-7738-47ce-9432-770f570fa47e\") " pod="openshift-dns/dns-default-jvk96" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.399619 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bmhr\" (UniqueName: \"kubernetes.io/projected/2c7183be-dcde-4752-8650-ffd4e9834ce5-kube-api-access-9bmhr\") pod \"catalog-operator-68c6474976-5c9h9\" (UID: \"2c7183be-dcde-4752-8650-ffd4e9834ce5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.399644 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f54680a-b95f-4074-a763-859a7e96962d-metrics-tls\") pod \"dns-operator-744455d44c-cxqx7\" (UID: 
\"0f54680a-b95f-4074-a763-859a7e96962d\") " pod="openshift-dns-operator/dns-operator-744455d44c-cxqx7" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.399732 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a2cc4fff-1922-45ac-871c-bab6f753b026-csi-data-dir\") pod \"csi-hostpathplugin-zr2lt\" (UID: \"a2cc4fff-1922-45ac-871c-bab6f753b026\") " pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.400084 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae8c092c-ec9d-456a-9ba3-5501c22f6280-config-volume\") pod \"collect-profiles-29319045-qpr28\" (UID: \"ae8c092c-ec9d-456a-9ba3-5501c22f6280\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.400388 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a2a2cd8-7738-47ce-9432-770f570fa47e-config-volume\") pod \"dns-default-jvk96\" (UID: \"9a2a2cd8-7738-47ce-9432-770f570fa47e\") " pod="openshift-dns/dns-default-jvk96" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.399617 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/074633af-0fd3-4335-ba5e-af7840383694-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lwcrf\" (UID: \"074633af-0fd3-4335-ba5e-af7840383694\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lwcrf" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.400933 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2914c017-5539-4c32-9b8a-c494dd5b397a-images\") pod \"machine-config-operator-74547568cd-hnqp5\" (UID: 
\"2914c017-5539-4c32-9b8a-c494dd5b397a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402037 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a2cc4fff-1922-45ac-871c-bab6f753b026-registration-dir\") pod \"csi-hostpathplugin-zr2lt\" (UID: \"a2cc4fff-1922-45ac-871c-bab6f753b026\") " pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402054 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6c5671cc-4e9d-423d-b0d6-ea9e86420210-signing-key\") pod \"service-ca-9c57cc56f-npmvp\" (UID: \"6c5671cc-4e9d-423d-b0d6-ea9e86420210\") " pod="openshift-service-ca/service-ca-9c57cc56f-npmvp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402135 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c6e8ca63-ed5f-4e08-8c2a-ac799993720c-apiservice-cert\") pod \"packageserver-d55dfcdfc-b7cd6\" (UID: \"c6e8ca63-ed5f-4e08-8c2a-ac799993720c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402194 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km27r\" (UniqueName: \"kubernetes.io/projected/40deae84-a862-4cbf-8acc-d03e9f516ff4-kube-api-access-km27r\") pod \"migrator-59844c95c7-6946k\" (UID: \"40deae84-a862-4cbf-8acc-d03e9f516ff4\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6946k" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402227 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/a2cc4fff-1922-45ac-871c-bab6f753b026-mountpoint-dir\") pod \"csi-hostpathplugin-zr2lt\" (UID: \"a2cc4fff-1922-45ac-871c-bab6f753b026\") " pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402262 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a2a2cd8-7738-47ce-9432-770f570fa47e-metrics-tls\") pod \"dns-default-jvk96\" (UID: \"9a2a2cd8-7738-47ce-9432-770f570fa47e\") " pod="openshift-dns/dns-default-jvk96" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402297 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9681ca7c-896e-41a2-8f2a-129b813f6695-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8vwzv\" (UID: \"9681ca7c-896e-41a2-8f2a-129b813f6695\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8vwzv" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402340 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr2hm\" (UniqueName: \"kubernetes.io/projected/c6e8ca63-ed5f-4e08-8c2a-ac799993720c-kube-api-access-sr2hm\") pod \"packageserver-d55dfcdfc-b7cd6\" (UID: \"c6e8ca63-ed5f-4e08-8c2a-ac799993720c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402388 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ae78dd2a-9513-40d7-b38c-6ba848c3c558-etcd-client\") pod \"etcd-operator-b45778765-79rht\" (UID: \"ae78dd2a-9513-40d7-b38c-6ba848c3c558\") " pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402390 4752 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a2cc4fff-1922-45ac-871c-bab6f753b026-mountpoint-dir\") pod \"csi-hostpathplugin-zr2lt\" (UID: \"a2cc4fff-1922-45ac-871c-bab6f753b026\") " pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402410 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2914c017-5539-4c32-9b8a-c494dd5b397a-proxy-tls\") pod \"machine-config-operator-74547568cd-hnqp5\" (UID: \"2914c017-5539-4c32-9b8a-c494dd5b397a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402435 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ae78dd2a-9513-40d7-b38c-6ba848c3c558-etcd-service-ca\") pod \"etcd-operator-b45778765-79rht\" (UID: \"ae78dd2a-9513-40d7-b38c-6ba848c3c558\") " pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402479 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/bb3c5ed6-797d-4357-85a0-7520a82c5f06-node-bootstrap-token\") pod \"machine-config-server-bhv9s\" (UID: \"bb3c5ed6-797d-4357-85a0-7520a82c5f06\") " pod="openshift-machine-config-operator/machine-config-server-bhv9s" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402503 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae78dd2a-9513-40d7-b38c-6ba848c3c558-config\") pod \"etcd-operator-b45778765-79rht\" (UID: \"ae78dd2a-9513-40d7-b38c-6ba848c3c558\") " pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402535 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d04ea3ea-71ac-481c-990b-a989a6f61516-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pbmfv\" (UID: \"d04ea3ea-71ac-481c-990b-a989a6f61516\") " pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402564 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2c7183be-dcde-4752-8650-ffd4e9834ce5-srv-cert\") pod \"catalog-operator-68c6474976-5c9h9\" (UID: \"2c7183be-dcde-4752-8650-ffd4e9834ce5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402585 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c6rh\" (UniqueName: \"kubernetes.io/projected/4ad84a5d-2ceb-457c-af21-d15ac31a54ef-kube-api-access-4c6rh\") pod \"ingress-canary-fc286\" (UID: \"4ad84a5d-2ceb-457c-af21-d15ac31a54ef\") " pod="openshift-ingress-canary/ingress-canary-fc286" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402611 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/bb3c5ed6-797d-4357-85a0-7520a82c5f06-certs\") pod \"machine-config-server-bhv9s\" (UID: \"bb3c5ed6-797d-4357-85a0-7520a82c5f06\") " pod="openshift-machine-config-operator/machine-config-server-bhv9s" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402632 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a2cc4fff-1922-45ac-871c-bab6f753b026-plugins-dir\") pod \"csi-hostpathplugin-zr2lt\" (UID: \"a2cc4fff-1922-45ac-871c-bab6f753b026\") " pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" Sep 29 10:46:42 crc kubenswrapper[4752]: 
I0929 10:46:42.402660 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2914c017-5539-4c32-9b8a-c494dd5b397a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hnqp5\" (UID: \"2914c017-5539-4c32-9b8a-c494dd5b397a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402683 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49c0607a-8e84-4961-9e37-45434deacc31-serving-cert\") pod \"service-ca-operator-777779d784-mw85q\" (UID: \"49c0607a-8e84-4961-9e37-45434deacc31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw85q" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402708 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6c5671cc-4e9d-423d-b0d6-ea9e86420210-signing-cabundle\") pod \"service-ca-9c57cc56f-npmvp\" (UID: \"6c5671cc-4e9d-423d-b0d6-ea9e86420210\") " pod="openshift-service-ca/service-ca-9c57cc56f-npmvp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402754 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/074633af-0fd3-4335-ba5e-af7840383694-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lwcrf\" (UID: \"074633af-0fd3-4335-ba5e-af7840383694\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lwcrf" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402782 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-492xz\" (UniqueName: \"kubernetes.io/projected/2914c017-5539-4c32-9b8a-c494dd5b397a-kube-api-access-492xz\") pod \"machine-config-operator-74547568cd-hnqp5\" (UID: 
\"2914c017-5539-4c32-9b8a-c494dd5b397a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402825 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb8bh\" (UniqueName: \"kubernetes.io/projected/5eaef9fd-30e6-47e1-afbf-8c0a464a4512-kube-api-access-mb8bh\") pod \"machine-config-controller-84d6567774-974fr\" (UID: \"5eaef9fd-30e6-47e1-afbf-8c0a464a4512\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-974fr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402846 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c6e8ca63-ed5f-4e08-8c2a-ac799993720c-webhook-cert\") pod \"packageserver-d55dfcdfc-b7cd6\" (UID: \"c6e8ca63-ed5f-4e08-8c2a-ac799993720c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402869 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259601c2-169a-446f-9fb7-e93aefc143b4-srv-cert\") pod \"olm-operator-6b444d44fb-gvhpf\" (UID: \"259601c2-169a-446f-9fb7-e93aefc143b4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gvhpf" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.403057 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c6e8ca63-ed5f-4e08-8c2a-ac799993720c-tmpfs\") pod \"packageserver-d55dfcdfc-b7cd6\" (UID: \"c6e8ca63-ed5f-4e08-8c2a-ac799993720c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.403089 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/074633af-0fd3-4335-ba5e-af7840383694-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lwcrf\" (UID: \"074633af-0fd3-4335-ba5e-af7840383694\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lwcrf" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.403136 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbcwc\" (UniqueName: \"kubernetes.io/projected/ae8c092c-ec9d-456a-9ba3-5501c22f6280-kube-api-access-gbcwc\") pod \"collect-profiles-29319045-qpr28\" (UID: \"ae8c092c-ec9d-456a-9ba3-5501c22f6280\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.403163 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mjc4\" (UniqueName: \"kubernetes.io/projected/259601c2-169a-446f-9fb7-e93aefc143b4-kube-api-access-6mjc4\") pod \"olm-operator-6b444d44fb-gvhpf\" (UID: \"259601c2-169a-446f-9fb7-e93aefc143b4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gvhpf" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.403376 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4ad84a5d-2ceb-457c-af21-d15ac31a54ef-cert\") pod \"ingress-canary-fc286\" (UID: \"4ad84a5d-2ceb-457c-af21-d15ac31a54ef\") " pod="openshift-ingress-canary/ingress-canary-fc286" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.403638 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a2cc4fff-1922-45ac-871c-bab6f753b026-plugins-dir\") pod \"csi-hostpathplugin-zr2lt\" (UID: \"a2cc4fff-1922-45ac-871c-bab6f753b026\") " pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.403638 4752 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d04ea3ea-71ac-481c-990b-a989a6f61516-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pbmfv\" (UID: \"d04ea3ea-71ac-481c-990b-a989a6f61516\") " pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.403785 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae78dd2a-9513-40d7-b38c-6ba848c3c558-config\") pod \"etcd-operator-b45778765-79rht\" (UID: \"ae78dd2a-9513-40d7-b38c-6ba848c3c558\") " pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.402333 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae78dd2a-9513-40d7-b38c-6ba848c3c558-serving-cert\") pod \"etcd-operator-b45778765-79rht\" (UID: \"ae78dd2a-9513-40d7-b38c-6ba848c3c558\") " pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.404305 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f54680a-b95f-4074-a763-859a7e96962d-metrics-tls\") pod \"dns-operator-744455d44c-cxqx7\" (UID: \"0f54680a-b95f-4074-a763-859a7e96962d\") " pod="openshift-dns-operator/dns-operator-744455d44c-cxqx7" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.404419 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ae78dd2a-9513-40d7-b38c-6ba848c3c558-etcd-service-ca\") pod \"etcd-operator-b45778765-79rht\" (UID: \"ae78dd2a-9513-40d7-b38c-6ba848c3c558\") " pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.404542 4752 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2914c017-5539-4c32-9b8a-c494dd5b397a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hnqp5\" (UID: \"2914c017-5539-4c32-9b8a-c494dd5b397a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.404596 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ae78dd2a-9513-40d7-b38c-6ba848c3c558-etcd-ca\") pod \"etcd-operator-b45778765-79rht\" (UID: \"ae78dd2a-9513-40d7-b38c-6ba848c3c558\") " pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.405515 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d04ea3ea-71ac-481c-990b-a989a6f61516-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pbmfv\" (UID: \"d04ea3ea-71ac-481c-990b-a989a6f61516\") " pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.406253 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ae8c092c-ec9d-456a-9ba3-5501c22f6280-secret-volume\") pod \"collect-profiles-29319045-qpr28\" (UID: \"ae8c092c-ec9d-456a-9ba3-5501c22f6280\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.406342 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c6e8ca63-ed5f-4e08-8c2a-ac799993720c-tmpfs\") pod \"packageserver-d55dfcdfc-b7cd6\" (UID: \"c6e8ca63-ed5f-4e08-8c2a-ac799993720c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.409292 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6c5671cc-4e9d-423d-b0d6-ea9e86420210-signing-cabundle\") pod \"service-ca-9c57cc56f-npmvp\" (UID: \"6c5671cc-4e9d-423d-b0d6-ea9e86420210\") " pod="openshift-service-ca/service-ca-9c57cc56f-npmvp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.414695 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49c0607a-8e84-4961-9e37-45434deacc31-serving-cert\") pod \"service-ca-operator-777779d784-mw85q\" (UID: \"49c0607a-8e84-4961-9e37-45434deacc31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw85q" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.414697 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/074633af-0fd3-4335-ba5e-af7840383694-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lwcrf\" (UID: \"074633af-0fd3-4335-ba5e-af7840383694\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lwcrf" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.415188 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ae78dd2a-9513-40d7-b38c-6ba848c3c558-etcd-client\") pod \"etcd-operator-b45778765-79rht\" (UID: \"ae78dd2a-9513-40d7-b38c-6ba848c3c558\") " pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.415515 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2914c017-5539-4c32-9b8a-c494dd5b397a-proxy-tls\") pod \"machine-config-operator-74547568cd-hnqp5\" (UID: \"2914c017-5539-4c32-9b8a-c494dd5b397a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5" Sep 29 10:46:42 crc 
kubenswrapper[4752]: I0929 10:46:42.415564 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c6e8ca63-ed5f-4e08-8c2a-ac799993720c-webhook-cert\") pod \"packageserver-d55dfcdfc-b7cd6\" (UID: \"c6e8ca63-ed5f-4e08-8c2a-ac799993720c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.415928 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5eaef9fd-30e6-47e1-afbf-8c0a464a4512-proxy-tls\") pod \"machine-config-controller-84d6567774-974fr\" (UID: \"5eaef9fd-30e6-47e1-afbf-8c0a464a4512\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-974fr" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.416055 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/bb3c5ed6-797d-4357-85a0-7520a82c5f06-node-bootstrap-token\") pod \"machine-config-server-bhv9s\" (UID: \"bb3c5ed6-797d-4357-85a0-7520a82c5f06\") " pod="openshift-machine-config-operator/machine-config-server-bhv9s" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.416143 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/bb3c5ed6-797d-4357-85a0-7520a82c5f06-certs\") pod \"machine-config-server-bhv9s\" (UID: \"bb3c5ed6-797d-4357-85a0-7520a82c5f06\") " pod="openshift-machine-config-operator/machine-config-server-bhv9s" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.416394 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/981034b2-30d0-4fea-933d-ef36b1b20d25-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z4kqj\" (UID: \"981034b2-30d0-4fea-933d-ef36b1b20d25\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.416707 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2c7183be-dcde-4752-8650-ffd4e9834ce5-profile-collector-cert\") pod \"catalog-operator-68c6474976-5c9h9\" (UID: \"2c7183be-dcde-4752-8650-ffd4e9834ce5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.417446 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/259601c2-169a-446f-9fb7-e93aefc143b4-profile-collector-cert\") pod \"olm-operator-6b444d44fb-gvhpf\" (UID: \"259601c2-169a-446f-9fb7-e93aefc143b4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gvhpf" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.417969 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259601c2-169a-446f-9fb7-e93aefc143b4-srv-cert\") pod \"olm-operator-6b444d44fb-gvhpf\" (UID: \"259601c2-169a-446f-9fb7-e93aefc143b4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gvhpf" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.417993 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2c7183be-dcde-4752-8650-ffd4e9834ce5-srv-cert\") pod \"catalog-operator-68c6474976-5c9h9\" (UID: \"2c7183be-dcde-4752-8650-ffd4e9834ce5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.418594 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6c5671cc-4e9d-423d-b0d6-ea9e86420210-signing-key\") pod \"service-ca-9c57cc56f-npmvp\" (UID: 
\"6c5671cc-4e9d-423d-b0d6-ea9e86420210\") " pod="openshift-service-ca/service-ca-9c57cc56f-npmvp" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.419692 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9681ca7c-896e-41a2-8f2a-129b813f6695-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8vwzv\" (UID: \"9681ca7c-896e-41a2-8f2a-129b813f6695\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8vwzv" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.420959 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a2a2cd8-7738-47ce-9432-770f570fa47e-metrics-tls\") pod \"dns-default-jvk96\" (UID: \"9a2a2cd8-7738-47ce-9432-770f570fa47e\") " pod="openshift-dns/dns-default-jvk96" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.421216 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c6e8ca63-ed5f-4e08-8c2a-ac799993720c-apiservice-cert\") pod \"packageserver-d55dfcdfc-b7cd6\" (UID: \"c6e8ca63-ed5f-4e08-8c2a-ac799993720c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.424337 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f756d24-5e77-4130-b920-794234a82ece-bound-sa-token\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.446242 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk6x6\" (UniqueName: \"kubernetes.io/projected/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-kube-api-access-hk6x6\") pod 
\"console-f9d7485db-bp6hz\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.466393 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnlvs\" (UniqueName: \"kubernetes.io/projected/1ee4b36a-3e8c-41f7-9457-7928047cd0c6-kube-api-access-rnlvs\") pod \"openshift-controller-manager-operator-756b6f6bc6-s6ksk\" (UID: \"1ee4b36a-3e8c-41f7-9457-7928047cd0c6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s6ksk" Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.504474 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:42 crc kubenswrapper[4752]: E0929 10:46:42.504914 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:43.004892595 +0000 UTC m=+143.794034262 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.507579 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzvrr\" (UniqueName: \"kubernetes.io/projected/d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d-kube-api-access-pzvrr\") pod \"router-default-5444994796-r8k8r\" (UID: \"d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d\") " pod="openshift-ingress/router-default-5444994796-r8k8r"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.511895 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.526082 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l76nd\" (UniqueName: \"kubernetes.io/projected/981034b2-30d0-4fea-933d-ef36b1b20d25-kube-api-access-l76nd\") pod \"ingress-operator-5b745b69d9-z4kqj\" (UID: \"981034b2-30d0-4fea-933d-ef36b1b20d25\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.544832 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59kq2\" (UniqueName: \"kubernetes.io/projected/c234e889-6259-4ada-827f-532882f57c4c-kube-api-access-59kq2\") pod \"authentication-operator-69f744f599-dbt29\" (UID: \"c234e889-6259-4ada-827f-532882f57c4c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dbt29"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.552168 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-prpgr"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.566161 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk7nw\" (UniqueName: \"kubernetes.io/projected/82b648a1-4d55-4994-b471-4938eeb34bd0-kube-api-access-nk7nw\") pod \"apiserver-7bbb656c7d-jvmfr\" (UID: \"82b648a1-4d55-4994-b471-4938eeb34bd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.567927 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dbt29"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.576030 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-bp6hz"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.581304 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s6ksk"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.586260 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brb8j\" (UniqueName: \"kubernetes.io/projected/50f1884b-af0d-429a-b38e-ffd726a9463b-kube-api-access-brb8j\") pod \"kube-storage-version-migrator-operator-b67b599dd-mfldp\" (UID: \"50f1884b-af0d-429a-b38e-ffd726a9463b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mfldp"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.587149 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bq279"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.595041 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.608078 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkk2f\" (UniqueName: \"kubernetes.io/projected/24ab4270-1ece-4201-94ae-51c71902c3f1-kube-api-access-nkk2f\") pod \"control-plane-machine-set-operator-78cbb6b69f-s8pf8\" (UID: \"24ab4270-1ece-4201-94ae-51c71902c3f1\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s8pf8"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.608726 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 10:46:42 crc kubenswrapper[4752]: E0929 10:46:42.609101 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:43.10907914 +0000 UTC m=+143.898220807 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.609511 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh"
Sep 29 10:46:42 crc kubenswrapper[4752]: E0929 10:46:42.609965 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:43.109958135 +0000 UTC m=+143.899099802 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.615967 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hbftw"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.625995 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-r8k8r"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.631616 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwmb6\" (UniqueName: \"kubernetes.io/projected/78a002dc-c902-472c-b269-9ec7c99ab835-kube-api-access-wwmb6\") pod \"route-controller-manager-6576b87f9c-q8xbr\" (UID: \"78a002dc-c902-472c-b269-9ec7c99ab835\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.644080 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-krscp"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.649702 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdmbs\" (UniqueName: \"kubernetes.io/projected/8f756d24-5e77-4130-b920-794234a82ece-kube-api-access-gdmbs\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.650297 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mfldp"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.672757 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tddv6\" (UniqueName: \"kubernetes.io/projected/44ae2b29-ec3a-4321-8590-4d316d810034-kube-api-access-tddv6\") pod \"downloads-7954f5f757-bhw29\" (UID: \"44ae2b29-ec3a-4321-8590-4d316d810034\") " pod="openshift-console/downloads-7954f5f757-bhw29"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.688014 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wtkn\" (UniqueName: \"kubernetes.io/projected/6c5671cc-4e9d-423d-b0d6-ea9e86420210-kube-api-access-4wtkn\") pod \"service-ca-9c57cc56f-npmvp\" (UID: \"6c5671cc-4e9d-423d-b0d6-ea9e86420210\") " pod="openshift-service-ca/service-ca-9c57cc56f-npmvp"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.710973 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 10:46:42 crc kubenswrapper[4752]: E0929 10:46:42.711548 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:43.211526727 +0000 UTC m=+144.000668394 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.716369 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k948\" (UniqueName: \"kubernetes.io/projected/49c0607a-8e84-4961-9e37-45434deacc31-kube-api-access-6k948\") pod \"service-ca-operator-777779d784-mw85q\" (UID: \"49c0607a-8e84-4961-9e37-45434deacc31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw85q"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.736924 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw85q"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.741378 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rxcj\" (UniqueName: \"kubernetes.io/projected/9a2a2cd8-7738-47ce-9432-770f570fa47e-kube-api-access-2rxcj\") pod \"dns-default-jvk96\" (UID: \"9a2a2cd8-7738-47ce-9432-770f570fa47e\") " pod="openshift-dns/dns-default-jvk96"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.741594 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-whxr2"]
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.742378 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-npmvp"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.758791 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf22s\" (UniqueName: \"kubernetes.io/projected/d04ea3ea-71ac-481c-990b-a989a6f61516-kube-api-access-gf22s\") pod \"marketplace-operator-79b997595-pbmfv\" (UID: \"d04ea3ea-71ac-481c-990b-a989a6f61516\") " pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.775001 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v4bh\" (UniqueName: \"kubernetes.io/projected/9681ca7c-896e-41a2-8f2a-129b813f6695-kube-api-access-6v4bh\") pod \"package-server-manager-789f6589d5-8vwzv\" (UID: \"9681ca7c-896e-41a2-8f2a-129b813f6695\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8vwzv"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.776662 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jvk96"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.782893 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b26v7\" (UniqueName: \"kubernetes.io/projected/bb3c5ed6-797d-4357-85a0-7520a82c5f06-kube-api-access-b26v7\") pod \"machine-config-server-bhv9s\" (UID: \"bb3c5ed6-797d-4357-85a0-7520a82c5f06\") " pod="openshift-machine-config-operator/machine-config-server-bhv9s"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.784000 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" event={"ID":"f3f0dcdd-283d-4ed5-889a-da260dcf13b0","Type":"ContainerStarted","Data":"f4db0405385faa272d014a5e7ff66239694e2731c81f5d9603858b5015cb9e7d"}
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.784482 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.787607 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r8k8r" event={"ID":"d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d","Type":"ContainerStarted","Data":"3e137e974b571ccaddd13c84a816c2193c65cbb31261d1f4790c73ef23f17d54"}
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.788441 4752 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-rvflz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body=
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.788499 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" podUID="f3f0dcdd-283d-4ed5-889a-da260dcf13b0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.790314 4752 generic.go:334] "Generic (PLEG): container finished" podID="2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68" containerID="9ea871a637cb32760cf610daea66bdca80539875945c11288ac4cdf237796079" exitCode=0
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.790429 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" event={"ID":"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68","Type":"ContainerDied","Data":"9ea871a637cb32760cf610daea66bdca80539875945c11288ac4cdf237796079"}
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.813109 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh"
Sep 29 10:46:42 crc kubenswrapper[4752]: E0929 10:46:42.813539 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:43.313524873 +0000 UTC m=+144.102666540 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.813698 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bmhr\" (UniqueName: \"kubernetes.io/projected/2c7183be-dcde-4752-8650-ffd4e9834ce5-kube-api-access-9bmhr\") pod \"catalog-operator-68c6474976-5c9h9\" (UID: \"2c7183be-dcde-4752-8650-ffd4e9834ce5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.818072 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ntjk6" event={"ID":"2477e356-dc04-44a6-bec0-e7304134493f","Type":"ContainerStarted","Data":"9fe64ab48e55e9d9e901bf4b3882899db4c871082b94ca493309568f29e47461"}
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.821770 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-bhw29"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.825567 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7kk" event={"ID":"fd3a77bc-f325-4019-ad7b-e03b97e0471a","Type":"ContainerStarted","Data":"8c5d6857a4d0173ad07e2ce7ae281666da832d0524a635331707b7c78cd73f5b"}
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.825619 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7kk" event={"ID":"fd3a77bc-f325-4019-ad7b-e03b97e0471a","Type":"ContainerStarted","Data":"7a085fbce6dbf20f3d93b0219cdc59a49f4ca6004d8e03f90d8e6cdcaa8dd45b"}
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.825630 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7kk" event={"ID":"fd3a77bc-f325-4019-ad7b-e03b97e0471a","Type":"ContainerStarted","Data":"68c53c923734a3ffab0c03cec19908e5b964c903c82efc89c31872de13921635"}
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.825795 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jgtb\" (UniqueName: \"kubernetes.io/projected/ae78dd2a-9513-40d7-b38c-6ba848c3c558-kube-api-access-4jgtb\") pod \"etcd-operator-b45778765-79rht\" (UID: \"ae78dd2a-9513-40d7-b38c-6ba848c3c558\") " pod="openshift-etcd-operator/etcd-operator-b45778765-79rht"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.830591 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.839643 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.842433 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4t7tw" event={"ID":"e9b3ddc5-03c1-4d65-b890-660e9e8cc6c0","Type":"ContainerStarted","Data":"f618135eaf905d64766df7091626cf7c62382134bfe756987688804805d73ad2"}
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.850139 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhc8c\" (UniqueName: \"kubernetes.io/projected/a2cc4fff-1922-45ac-871c-bab6f753b026-kube-api-access-fhc8c\") pod \"csi-hostpathplugin-zr2lt\" (UID: \"a2cc4fff-1922-45ac-871c-bab6f753b026\") " pod="hostpath-provisioner/csi-hostpathplugin-zr2lt"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.861633 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6ng8r" event={"ID":"e8089407-89fb-42c1-8947-58fa83f8ef4c","Type":"ContainerStarted","Data":"11125666100c82179833ccab369295c7171262eb65cec92d2e52f4953ba8ff66"}
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.878887 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pl84\" (UniqueName: \"kubernetes.io/projected/0f54680a-b95f-4074-a763-859a7e96962d-kube-api-access-9pl84\") pod \"dns-operator-744455d44c-cxqx7\" (UID: \"0f54680a-b95f-4074-a763-859a7e96962d\") " pod="openshift-dns-operator/dns-operator-744455d44c-cxqx7"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.884030 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-48jd6" event={"ID":"6b7ad647-b9dc-4694-beab-5908b529d9cf","Type":"ContainerStarted","Data":"b0d9b0329e4fa78f59b36382097cdea2eb05253369a2d17bcdc9bbc78fa6838d"}
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.899622 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s8pf8"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.914690 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 10:46:42 crc kubenswrapper[4752]: E0929 10:46:42.916086 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:43.416072332 +0000 UTC m=+144.205213999 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.917704 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km27r\" (UniqueName: \"kubernetes.io/projected/40deae84-a862-4cbf-8acc-d03e9f516ff4-kube-api-access-km27r\") pod \"migrator-59844c95c7-6946k\" (UID: \"40deae84-a862-4cbf-8acc-d03e9f516ff4\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6946k"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.919886 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr2hm\" (UniqueName: \"kubernetes.io/projected/c6e8ca63-ed5f-4e08-8c2a-ac799993720c-kube-api-access-sr2hm\") pod \"packageserver-d55dfcdfc-b7cd6\" (UID: \"c6e8ca63-ed5f-4e08-8c2a-ac799993720c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.943653 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mjc4\" (UniqueName: \"kubernetes.io/projected/259601c2-169a-446f-9fb7-e93aefc143b4-kube-api-access-6mjc4\") pod \"olm-operator-6b444d44fb-gvhpf\" (UID: \"259601c2-169a-446f-9fb7-e93aefc143b4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gvhpf"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.966641 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-492xz\" (UniqueName: \"kubernetes.io/projected/2914c017-5539-4c32-9b8a-c494dd5b397a-kube-api-access-492xz\") pod \"machine-config-operator-74547568cd-hnqp5\" (UID: \"2914c017-5539-4c32-9b8a-c494dd5b397a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.970327 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-79rht"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.975117 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6946k"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.986252 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5"
Sep 29 10:46:42 crc kubenswrapper[4752]: I0929 10:46:42.995213 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb8bh\" (UniqueName: \"kubernetes.io/projected/5eaef9fd-30e6-47e1-afbf-8c0a464a4512-kube-api-access-mb8bh\") pod \"machine-config-controller-84d6567774-974fr\" (UID: \"5eaef9fd-30e6-47e1-afbf-8c0a464a4512\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-974fr"
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.006103 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8vwzv"
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.010338 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c6rh\" (UniqueName: \"kubernetes.io/projected/4ad84a5d-2ceb-457c-af21-d15ac31a54ef-kube-api-access-4c6rh\") pod \"ingress-canary-fc286\" (UID: \"4ad84a5d-2ceb-457c-af21-d15ac31a54ef\") " pod="openshift-ingress-canary/ingress-canary-fc286"
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.013141 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dbt29"]
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.013193 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5"]
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.016460 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9"
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.035646 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gvhpf"
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.039258 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/074633af-0fd3-4335-ba5e-af7840383694-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lwcrf\" (UID: \"074633af-0fd3-4335-ba5e-af7840383694\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lwcrf"
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.055586 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-prpgr"]
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.055663 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-bp6hz"]
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.059853 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbcwc\" (UniqueName: \"kubernetes.io/projected/ae8c092c-ec9d-456a-9ba3-5501c22f6280-kube-api-access-gbcwc\") pod \"collect-profiles-29319045-qpr28\" (UID: \"ae8c092c-ec9d-456a-9ba3-5501c22f6280\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28"
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.073787 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh"
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.077038 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6"
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.078346 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv"
Sep 29 10:46:43 crc kubenswrapper[4752]: E0929 10:46:43.078674 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:43.578653629 +0000 UTC m=+144.367795296 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.078814 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-bhv9s"
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.079173 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fc286"
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.080118 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-cxqx7"
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.095880 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-zr2lt"
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.175341 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 10:46:43 crc kubenswrapper[4752]: E0929 10:46:43.176263 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:43.676246853 +0000 UTC m=+144.465388520 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.257993 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-974fr"
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.276949 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh"
Sep 29 10:46:43 crc kubenswrapper[4752]: E0929 10:46:43.277419 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:43.777402425 +0000 UTC m=+144.566544102 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.291225 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28"
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.298515 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lwcrf"
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.378315 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 10:46:43 crc kubenswrapper[4752]: E0929 10:46:43.378565 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:43.878547467 +0000 UTC m=+144.667689124 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.378595 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh"
Sep 29 10:46:43 crc kubenswrapper[4752]: E0929 10:46:43.379025 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:43.87901292 +0000 UTC m=+144.668154587 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.480626 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 10:46:43 crc kubenswrapper[4752]: E0929 10:46:43.480895 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:43.98084715 +0000 UTC m=+144.769988817 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.481412 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh"
Sep 29 10:46:43 crc kubenswrapper[4752]: E0929 10:46:43.481890 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:43.981871208 +0000 UTC m=+144.771012875 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.573759 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj"] Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.593633 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:43 crc kubenswrapper[4752]: E0929 10:46:43.593866 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:44.093783057 +0000 UTC m=+144.882924734 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.594143 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:43 crc kubenswrapper[4752]: E0929 10:46:43.594758 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:44.094744203 +0000 UTC m=+144.883885870 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.597300 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s6ksk"] Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.697368 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:43 crc kubenswrapper[4752]: E0929 10:46:43.697588 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:44.197503739 +0000 UTC m=+144.986645406 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.698451 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:43 crc kubenswrapper[4752]: E0929 10:46:43.700263 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:44.200245615 +0000 UTC m=+144.989387282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.799302 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:43 crc kubenswrapper[4752]: E0929 10:46:43.799821 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:44.299784992 +0000 UTC m=+145.088926659 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.842263 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" podStartSLOduration=123.842243274 podStartE2EDuration="2m3.842243274s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:43.84063591 +0000 UTC m=+144.629777577" watchObservedRunningTime="2025-09-29 10:46:43.842243274 +0000 UTC m=+144.631384941" Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.893449 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r8k8r" event={"ID":"d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d","Type":"ContainerStarted","Data":"920cfd487f7294b237983cdaa405410221f310dc3e2b458af7026ed1e1d3d6ee"} Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.898262 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-whxr2" event={"ID":"9ce060b9-be39-4731-a723-388817737e31","Type":"ContainerStarted","Data":"b1056a2cfa87cc5b30bcc8f47e25f9a63d5ece26a6253ceb9ecce84abb1ee87b"} Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.898452 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-whxr2" 
event={"ID":"9ce060b9-be39-4731-a723-388817737e31","Type":"ContainerStarted","Data":"ff4bd6eb0541b9af092905668b427144e157ee914027722b296c3e768f2056d7"} Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.900025 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj" event={"ID":"981034b2-30d0-4fea-933d-ef36b1b20d25","Type":"ContainerStarted","Data":"22b1a16e070a735f0bb3a88fc73091aae452c32251295db9b6d03e880ce427ca"} Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.900619 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:43 crc kubenswrapper[4752]: E0929 10:46:43.901089 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:44.401073418 +0000 UTC m=+145.190215095 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.902106 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" event={"ID":"f00d489f-9b4a-421a-a02a-b9b090ae0449","Type":"ContainerStarted","Data":"a2f5b94cf3e1b07e27ccd94c0b80eb6a995b63ab6786bd1ae83da50d7a4e25c7"} Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.903655 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dbt29" event={"ID":"c234e889-6259-4ada-827f-532882f57c4c","Type":"ContainerStarted","Data":"217211c431454a27810db94fa1a04f9ebb53c2de821bd5ece5d09976f377b755"} Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.903688 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dbt29" event={"ID":"c234e889-6259-4ada-827f-532882f57c4c","Type":"ContainerStarted","Data":"0cbb910ae429c5f4274cab4a25e864cabd1ba0f340fb11f6d6211a4dc8a3a4f7"} Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.908281 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-bhv9s" event={"ID":"bb3c5ed6-797d-4357-85a0-7520a82c5f06","Type":"ContainerStarted","Data":"139041e243e1ca46530c91fd31c11711d0c9f18550efd85675d3e1b518c550e9"} Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.908359 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-server-bhv9s" event={"ID":"bb3c5ed6-797d-4357-85a0-7520a82c5f06","Type":"ContainerStarted","Data":"c4922e65a9ad0d80cf19c7f0d10fd10cb2cd5354871cf89ea92344ad5f681398"} Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.929973 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-48jd6" podStartSLOduration=123.929949244 podStartE2EDuration="2m3.929949244s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:43.887085322 +0000 UTC m=+144.676226999" watchObservedRunningTime="2025-09-29 10:46:43.929949244 +0000 UTC m=+144.719090911" Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.941681 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" event={"ID":"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68","Type":"ContainerStarted","Data":"e0e9bf91addcd40522fc1ce13bebf0c88bb2b30d37801aa98ef2910c6a39fd15"} Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.950764 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bp6hz" event={"ID":"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28","Type":"ContainerStarted","Data":"69f7996091c1816562d3e389be666e8d0a58e4432aa8c659b999aa8397210cd8"} Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.951296 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bp6hz" event={"ID":"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28","Type":"ContainerStarted","Data":"bd4039e4b2e06554d906d80920591716a87dcfd00dadba613f66b85043d2facb"} Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.958137 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5" 
event={"ID":"033ab5f2-0ffb-4f26-b703-8380971d1d7e","Type":"ContainerStarted","Data":"ba89a74f5d42e84848f3b7e243a875dbf76ec8cb263d1052730596981b6fd7f4"} Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.958194 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5" event={"ID":"033ab5f2-0ffb-4f26-b703-8380971d1d7e","Type":"ContainerStarted","Data":"4a91258dcb835b54c84c2c963b5589ef96f695ac553515fb0042144a16e7061e"} Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.968641 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-ntjk6" podStartSLOduration=123.968615382 podStartE2EDuration="2m3.968615382s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:43.96603303 +0000 UTC m=+144.755174697" watchObservedRunningTime="2025-09-29 10:46:43.968615382 +0000 UTC m=+144.757757049" Sep 29 10:46:43 crc kubenswrapper[4752]: I0929 10:46:43.988146 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.005830 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6ng8r" podStartSLOduration=124.005779548 podStartE2EDuration="2m4.005779548s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:44.004965695 +0000 UTC m=+144.794107362" watchObservedRunningTime="2025-09-29 10:46:44.005779548 +0000 UTC m=+144.794921215" Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.011315 4752 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:44 crc kubenswrapper[4752]: E0929 10:46:44.012853 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:44.512829052 +0000 UTC m=+145.301970729 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.117578 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:44 crc kubenswrapper[4752]: E0929 10:46:44.118075 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:44.618057956 +0000 UTC m=+145.407199623 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.218690 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:44 crc kubenswrapper[4752]: E0929 10:46:44.219014 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:44.718994862 +0000 UTC m=+145.508136539 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.320441 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:44 crc kubenswrapper[4752]: E0929 10:46:44.320858 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:44.820835713 +0000 UTC m=+145.609977380 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.421979 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:44 crc kubenswrapper[4752]: E0929 10:46:44.422282 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:44.922262042 +0000 UTC m=+145.711403709 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.525770 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:44 crc kubenswrapper[4752]: E0929 10:46:44.526183 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:45.026168769 +0000 UTC m=+145.815310436 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.566421 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7kk" podStartSLOduration=124.56640557 podStartE2EDuration="2m4.56640557s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:44.56423075 +0000 UTC m=+145.353372417" watchObservedRunningTime="2025-09-29 10:46:44.56640557 +0000 UTC m=+145.355547237" Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.627457 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-r8k8r" Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.629666 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:44 crc kubenswrapper[4752]: E0929 10:46:44.629939 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-29 10:46:45.129910392 +0000 UTC m=+145.919052049 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.630219 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:44 crc kubenswrapper[4752]: E0929 10:46:44.630937 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:45.130918151 +0000 UTC m=+145.920059818 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.641599 4752 patch_prober.go:28] interesting pod/router-default-5444994796-r8k8r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 10:46:44 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Sep 29 10:46:44 crc kubenswrapper[4752]: [+]process-running ok Sep 29 10:46:44 crc kubenswrapper[4752]: healthz check failed Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.641690 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8k8r" podUID="d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.731073 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4t7tw" podStartSLOduration=124.731056875 podStartE2EDuration="2m4.731056875s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:44.678568956 +0000 UTC m=+145.467710613" watchObservedRunningTime="2025-09-29 10:46:44.731056875 +0000 UTC m=+145.520198542" Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.732230 4752 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:44 crc kubenswrapper[4752]: E0929 10:46:44.732504 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:45.232457273 +0000 UTC m=+146.021598940 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.732578 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:44 crc kubenswrapper[4752]: E0929 10:46:44.733421 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:45.233412499 +0000 UTC m=+146.022554166 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.741611 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mfldp"] Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.759655 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bq279"] Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.791460 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hbftw"] Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.843515 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:44 crc kubenswrapper[4752]: E0929 10:46:44.843636 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:45.343617371 +0000 UTC m=+146.132759038 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.843934 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:44 crc kubenswrapper[4752]: E0929 10:46:44.844461 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:45.344450474 +0000 UTC m=+146.133592141 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.851002 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-npmvp"] Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.935880 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-r8k8r" podStartSLOduration=124.935864627 podStartE2EDuration="2m4.935864627s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:44.93527796 +0000 UTC m=+145.724419637" watchObservedRunningTime="2025-09-29 10:46:44.935864627 +0000 UTC m=+145.725006284" Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.947825 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:44 crc kubenswrapper[4752]: E0929 10:46:44.948118 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-29 10:46:45.448071554 +0000 UTC m=+146.237213221 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.948328 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:44 crc kubenswrapper[4752]: E0929 10:46:44.948650 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:45.448638129 +0000 UTC m=+146.237779796 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:44 crc kubenswrapper[4752]: I0929 10:46:44.992197 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hbftw" event={"ID":"6445acca-bd3a-41a4-8b4a-16607771e077","Type":"ContainerStarted","Data":"23d33916604f00ab0cc59b7afd774a0c831d976a98cd16fbb476cdefa7c51e42"} Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.001034 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr"] Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.011125 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jvk96"] Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.013788 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-krscp"] Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.028664 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bq279" event={"ID":"fc5ce698-c6c1-41fc-9b25-864081396f26","Type":"ContainerStarted","Data":"ee1098b4be121291b09d1e011444d4b342d90bbcab0c742295f50dc47cc77217"} Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.035699 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mw85q"] Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.048080 4752 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" event={"ID":"f00d489f-9b4a-421a-a02a-b9b090ae0449","Type":"ContainerStarted","Data":"0bf50d9bebe257170660e11029e3a817b7b0c30c5acf5ae33ab8d53fbc1261a4"} Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.048677 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.049746 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:45 crc kubenswrapper[4752]: E0929 10:46:45.050508 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:45.55046479 +0000 UTC m=+146.339606467 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.053587 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-npmvp" event={"ID":"6c5671cc-4e9d-423d-b0d6-ea9e86420210","Type":"ContainerStarted","Data":"1fffd11fc1636331bb670cc7df2f2ff917c1a5f7261a16b3f46a668238345208"} Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.061986 4752 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-prpgr container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.34:6443/healthz\": dial tcp 10.217.0.34:6443: connect: connection refused" start-of-body= Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.062049 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" podUID="f00d489f-9b4a-421a-a02a-b9b090ae0449" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.34:6443/healthz\": dial tcp 10.217.0.34:6443: connect: connection refused" Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.085934 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s6ksk" event={"ID":"1ee4b36a-3e8c-41f7-9457-7928047cd0c6","Type":"ContainerStarted","Data":"3dbda74a53fd9b6dc0889d11789d527f4a7998ee0559b2e9dde3ed9da2494f12"} Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.085995 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s6ksk" event={"ID":"1ee4b36a-3e8c-41f7-9457-7928047cd0c6","Type":"ContainerStarted","Data":"7ca82040544c57351b231a544eeb79cebc7bc3d29a245403fc6fc791cd6d18ae"} Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.141518 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj" event={"ID":"981034b2-30d0-4fea-933d-ef36b1b20d25","Type":"ContainerStarted","Data":"a966fed20a24ad3b05872190bbc6e84dc787922105f4867090dc057a9932297a"} Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.141595 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj" event={"ID":"981034b2-30d0-4fea-933d-ef36b1b20d25","Type":"ContainerStarted","Data":"16aae15e0130af628d3e52c244a088372a8daa7ad279d0b433d741c78617da2d"} Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.170672 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:45 crc kubenswrapper[4752]: E0929 10:46:45.180759 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:45.680738755 +0000 UTC m=+146.469880422 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.187336 4752 generic.go:334] "Generic (PLEG): container finished" podID="033ab5f2-0ffb-4f26-b703-8380971d1d7e" containerID="ba89a74f5d42e84848f3b7e243a875dbf76ec8cb263d1052730596981b6fd7f4" exitCode=0 Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.187428 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5" event={"ID":"033ab5f2-0ffb-4f26-b703-8380971d1d7e","Type":"ContainerDied","Data":"ba89a74f5d42e84848f3b7e243a875dbf76ec8cb263d1052730596981b6fd7f4"} Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.187469 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5" event={"ID":"033ab5f2-0ffb-4f26-b703-8380971d1d7e","Type":"ContainerStarted","Data":"f08639abb94a85ae9185ed412fdf2e40927ec4beb0794b6776f4ef8a5790e27e"} Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.189572 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5" Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.191317 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mfldp" event={"ID":"50f1884b-af0d-429a-b38e-ffd726a9463b","Type":"ContainerStarted","Data":"c34629920131ae8b26fc1e1339442771aa481b8101b40918ed87722f00cc6a4e"} Sep 29 
10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.201283 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" event={"ID":"2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68","Type":"ContainerStarted","Data":"cced8b5fb8a5d829e5adb130ac889ecb27efc67016703a224c5bf281f4e252cf"} Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.218296 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-dbt29" podStartSLOduration=125.218257851 podStartE2EDuration="2m5.218257851s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:45.215482534 +0000 UTC m=+146.004624201" watchObservedRunningTime="2025-09-29 10:46:45.218257851 +0000 UTC m=+146.007399518" Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.276472 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.278342 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-bp6hz" podStartSLOduration=125.278322228 podStartE2EDuration="2m5.278322228s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:45.275891611 +0000 UTC m=+146.065033398" watchObservedRunningTime="2025-09-29 10:46:45.278322228 +0000 UTC m=+146.067463895" Sep 29 10:46:45 crc kubenswrapper[4752]: E0929 10:46:45.278400 4752 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:45.778357629 +0000 UTC m=+146.567499296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.329758 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr"] Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.341067 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-cxqx7"] Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.343337 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6946k"] Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.358408 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5"] Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.379670 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:45 crc kubenswrapper[4752]: E0929 
10:46:45.382597 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:45.882583466 +0000 UTC m=+146.671725133 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.402467 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-whxr2" podStartSLOduration=125.402448334 podStartE2EDuration="2m5.402448334s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:45.373269858 +0000 UTC m=+146.162411535" watchObservedRunningTime="2025-09-29 10:46:45.402448334 +0000 UTC m=+146.191590001" Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.403882 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s8pf8"] Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.432985 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-bhv9s" podStartSLOduration=6.432957036 podStartE2EDuration="6.432957036s" podCreationTimestamp="2025-09-29 10:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-09-29 10:46:45.399943035 +0000 UTC m=+146.189084702" watchObservedRunningTime="2025-09-29 10:46:45.432957036 +0000 UTC m=+146.222098703" Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.468903 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fc286"] Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.481779 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:45 crc kubenswrapper[4752]: E0929 10:46:45.482169 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:45.982154164 +0000 UTC m=+146.771295831 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.486236 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z4kqj" podStartSLOduration=125.486214086 podStartE2EDuration="2m5.486214086s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:45.451106827 +0000 UTC m=+146.240248494" watchObservedRunningTime="2025-09-29 10:46:45.486214086 +0000 UTC m=+146.275355753" Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.487475 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9"] Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.507534 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6"] Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.530144 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5" podStartSLOduration=125.530119967 podStartE2EDuration="2m5.530119967s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:45.500111729 +0000 UTC m=+146.289253396" watchObservedRunningTime="2025-09-29 
10:46:45.530119967 +0000 UTC m=+146.319261634" Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.530536 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lwcrf"] Sep 29 10:46:45 crc kubenswrapper[4752]: W0929 10:46:45.535177 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2914c017_5539_4c32_9b8a_c494dd5b397a.slice/crio-9f4fc6041e46fc5d80a03005d6311aaf416d693543ec922bb955ab89ee4de752 WatchSource:0}: Error finding container 9f4fc6041e46fc5d80a03005d6311aaf416d693543ec922bb955ab89ee4de752: Status 404 returned error can't find the container with id 9f4fc6041e46fc5d80a03005d6311aaf416d693543ec922bb955ab89ee4de752 Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.545631 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8vwzv"] Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.555051 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mfldp" podStartSLOduration=125.555034505 podStartE2EDuration="2m5.555034505s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:45.527253698 +0000 UTC m=+146.316395375" watchObservedRunningTime="2025-09-29 10:46:45.555034505 +0000 UTC m=+146.344176172" Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.574433 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-bhw29"] Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.585317 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:45 crc kubenswrapper[4752]: E0929 10:46:45.588110 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:46.088080627 +0000 UTC m=+146.877222294 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.588375 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-79rht"] Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.610840 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pbmfv"] Sep 29 10:46:45 crc kubenswrapper[4752]: W0929 10:46:45.613148 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6e8ca63_ed5f_4e08_8c2a_ac799993720c.slice/crio-bd51974b8fc206862ea33135463b683ac99ef318804f20601e4107768360c34c WatchSource:0}: Error finding container bd51974b8fc206862ea33135463b683ac99ef318804f20601e4107768360c34c: Status 404 returned error can't find the container with id bd51974b8fc206862ea33135463b683ac99ef318804f20601e4107768360c34c Sep 29 10:46:45 
crc kubenswrapper[4752]: I0929 10:46:45.614140 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s6ksk" podStartSLOduration=125.614116975 podStartE2EDuration="2m5.614116975s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:45.567130779 +0000 UTC m=+146.356272446" watchObservedRunningTime="2025-09-29 10:46:45.614116975 +0000 UTC m=+146.403258642" Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.621057 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gvhpf"] Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.641346 4752 patch_prober.go:28] interesting pod/router-default-5444994796-r8k8r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 10:46:45 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Sep 29 10:46:45 crc kubenswrapper[4752]: [+]process-running ok Sep 29 10:46:45 crc kubenswrapper[4752]: healthz check failed Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.641409 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8k8r" podUID="d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.642598 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-974fr"] Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.683874 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["hostpath-provisioner/csi-hostpathplugin-zr2lt"] Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.684651 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" podStartSLOduration=125.684629332 podStartE2EDuration="2m5.684629332s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:45.613494829 +0000 UTC m=+146.402636496" watchObservedRunningTime="2025-09-29 10:46:45.684629332 +0000 UTC m=+146.473770999" Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.688088 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:45 crc kubenswrapper[4752]: E0929 10:46:45.688403 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:46.188390975 +0000 UTC m=+146.977532632 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.709401 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" podStartSLOduration=125.709379954 podStartE2EDuration="2m5.709379954s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:45.682554014 +0000 UTC m=+146.471695681" watchObservedRunningTime="2025-09-29 10:46:45.709379954 +0000 UTC m=+146.498521621" Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.745135 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28"] Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.793603 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:45 crc kubenswrapper[4752]: E0929 10:46:45.794035 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-29 10:46:46.294022951 +0000 UTC m=+147.083164618 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:45 crc kubenswrapper[4752]: I0929 10:46:45.897256 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:45 crc kubenswrapper[4752]: E0929 10:46:45.897671 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:46.397656081 +0000 UTC m=+147.186797738 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.000276 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:46 crc kubenswrapper[4752]: E0929 10:46:46.000636 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:46.500623172 +0000 UTC m=+147.289764839 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.101336 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:46 crc kubenswrapper[4752]: E0929 10:46:46.102070 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:46.602055082 +0000 UTC m=+147.391196749 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.205688 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:46 crc kubenswrapper[4752]: E0929 10:46:46.206158 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:46.706142935 +0000 UTC m=+147.495284612 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.232740 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.232947 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.246117 4752 patch_prober.go:28] interesting pod/apiserver-76f77b778f-7sz2n container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.15:8443/livez\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.246211 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" podUID="2f1a2a22-45fb-441a-a05d-fb6c6dbf9e68" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.15:8443/livez\": dial tcp 10.217.0.15:8443: connect: connection refused" Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.281840 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hbftw" event={"ID":"6445acca-bd3a-41a4-8b4a-16607771e077","Type":"ContainerStarted","Data":"94143e585748a24243940a764270e48fa2d3add53dbaac58570aff7202342419"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.282372 4752 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-hbftw" Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.288695 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bq279" event={"ID":"fc5ce698-c6c1-41fc-9b25-864081396f26","Type":"ContainerStarted","Data":"0d6c2e127d7111a0e7f155a7454ce958bc2632e62340580216851af8933f9fd2"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.297913 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" event={"ID":"a2cc4fff-1922-45ac-871c-bab6f753b026","Type":"ContainerStarted","Data":"7ee3989abd7abd2bfae0dcf1768c5543b17a25f8eebbc4156f5e63fa13488d85"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.300153 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-974fr" event={"ID":"5eaef9fd-30e6-47e1-afbf-8c0a464a4512","Type":"ContainerStarted","Data":"557a71c9ace40958ebad58b023596591bdcd1ef042c128a9b2eefea155863312"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.310308 4752 patch_prober.go:28] interesting pod/console-operator-58897d9998-hbftw container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.310355 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-hbftw" podUID="6445acca-bd3a-41a4-8b4a-16607771e077" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.311138 4752 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:46 crc kubenswrapper[4752]: E0929 10:46:46.311461 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:46.811447681 +0000 UTC m=+147.600589348 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.311692 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-bhw29" event={"ID":"44ae2b29-ec3a-4321-8590-4d316d810034","Type":"ContainerStarted","Data":"9f2b3c8e2da593c6eb058e9c94f62015e825ff295fd034ba15aa1aeba96c3c97"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.314109 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" event={"ID":"78a002dc-c902-472c-b269-9ec7c99ab835","Type":"ContainerStarted","Data":"794b031167ccc021f2a398b84f937712a08a39f7251e530e88554fbdbd37ebb3"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.314130 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" event={"ID":"78a002dc-c902-472c-b269-9ec7c99ab835","Type":"ContainerStarted","Data":"561cdc5d6a664e370b09607e23e77485312a91a9525f6db77e36bd3a2c612949"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.314604 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.327471 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-krscp" event={"ID":"f94d0b66-3c2d-46f5-bcdb-078cbb7cccae","Type":"ContainerStarted","Data":"33dba5c8edcc67bd0343f0528ae6f865b70ec21f06fbf109ecd58570c9b0d609"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.327525 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-krscp" event={"ID":"f94d0b66-3c2d-46f5-bcdb-078cbb7cccae","Type":"ContainerStarted","Data":"5bc32f8a35f0df1525fa2e842a9c2f4d7228485d959379ea300d4207724c2163"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.328673 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw85q" event={"ID":"49c0607a-8e84-4961-9e37-45434deacc31","Type":"ContainerStarted","Data":"1caf12faa58216439e3dd890ecdcd451b3d2dce5bfa6ae14fc41c877c9326048"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.328703 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw85q" event={"ID":"49c0607a-8e84-4961-9e37-45434deacc31","Type":"ContainerStarted","Data":"eeaf229fef7c1402ef86ccf57fb54fa0aa2f42b01872a6a1a2929314569d7546"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.329032 4752 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-q8xbr container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.329080 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" podUID="78a002dc-c902-472c-b269-9ec7c99ab835" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.330412 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mfldp" event={"ID":"50f1884b-af0d-429a-b38e-ffd726a9463b","Type":"ContainerStarted","Data":"1f08ac2b5fd012777be98a2f72f70a5b4e404c4292c42809a60b1908c251e1cb"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.342167 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9" event={"ID":"2c7183be-dcde-4752-8650-ffd4e9834ce5","Type":"ContainerStarted","Data":"0f121c8c54c9cf9e745151898266c0d12f011743a029b23c46f8b16af05f90e9"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.345979 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-hbftw" podStartSLOduration=126.345944873 podStartE2EDuration="2m6.345944873s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:46.310108554 +0000 UTC m=+147.099250221" watchObservedRunningTime="2025-09-29 10:46:46.345944873 +0000 UTC m=+147.135086540" Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.346780 
4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" podStartSLOduration=125.346774517 podStartE2EDuration="2m5.346774517s" podCreationTimestamp="2025-09-29 10:44:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:46.342568251 +0000 UTC m=+147.131709938" watchObservedRunningTime="2025-09-29 10:46:46.346774517 +0000 UTC m=+147.135916184" Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.353500 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jvk96" event={"ID":"9a2a2cd8-7738-47ce-9432-770f570fa47e","Type":"ContainerStarted","Data":"7edcf1d2044b278b12d31177083852e2631c013c70e8c4545dbaf4453603b229"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.360913 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28" event={"ID":"ae8c092c-ec9d-456a-9ba3-5501c22f6280","Type":"ContainerStarted","Data":"c8ad9f8b2cc8e085392060faf803526b988cba516e68587576fcdc4613320ce8"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.373504 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s8pf8" event={"ID":"24ab4270-1ece-4201-94ae-51c71902c3f1","Type":"ContainerStarted","Data":"09dae83eb604add6a237602ba8c6dcffa8731ffe77186c973d219e5ce7ce6626"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.397141 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-cxqx7" event={"ID":"0f54680a-b95f-4074-a763-859a7e96962d","Type":"ContainerStarted","Data":"74ad8df70888aaca9ea99677d5e2b4c05095f5bf498b0d8e4c615c2a69ff4287"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.405493 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" event={"ID":"82b648a1-4d55-4994-b471-4938eeb34bd0","Type":"ContainerStarted","Data":"968d6549ae02d8d66319ea5b54169fba910ed255d0ef7b99b0f5c576836a16d4"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.413757 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:46 crc kubenswrapper[4752]: E0929 10:46:46.415954 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:46.915938885 +0000 UTC m=+147.705080682 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.417856 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" event={"ID":"d04ea3ea-71ac-481c-990b-a989a6f61516","Type":"ContainerStarted","Data":"13c8ddc139cf32e517f16ddee6bdc81caf18342fe4a605c1a7be7368d909e3e6"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.431163 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-npmvp" event={"ID":"6c5671cc-4e9d-423d-b0d6-ea9e86420210","Type":"ContainerStarted","Data":"6299fadb3c79f66d9cb7fad37863783ef33f3e1f6f0bc7ff5706e944901fadab"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.434525 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8vwzv" event={"ID":"9681ca7c-896e-41a2-8f2a-129b813f6695","Type":"ContainerStarted","Data":"c6c1727e45f24f53551666ec012d6ef664772d6b563f5936b2bacbdb6d4a05c8"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.449007 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lwcrf" event={"ID":"074633af-0fd3-4335-ba5e-af7840383694","Type":"ContainerStarted","Data":"a0b0cc876165943a15f58c41eeb1ae42d2aaefc9194f21136751ff0201aa3a38"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.461521 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca/service-ca-9c57cc56f-npmvp" podStartSLOduration=125.461503303 podStartE2EDuration="2m5.461503303s" podCreationTimestamp="2025-09-29 10:44:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:46.458970133 +0000 UTC m=+147.248111810" watchObservedRunningTime="2025-09-29 10:46:46.461503303 +0000 UTC m=+147.250644970" Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.462344 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6" event={"ID":"c6e8ca63-ed5f-4e08-8c2a-ac799993720c","Type":"ContainerStarted","Data":"bd51974b8fc206862ea33135463b683ac99ef318804f20601e4107768360c34c"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.495696 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gvhpf" event={"ID":"259601c2-169a-446f-9fb7-e93aefc143b4","Type":"ContainerStarted","Data":"693a85355e99d34f4ae69cf3804a5276acec1520c94604b80783c8059a96eb59"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.503452 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" event={"ID":"ae78dd2a-9513-40d7-b38c-6ba848c3c558","Type":"ContainerStarted","Data":"482348d9f477305a905194a384842c690d075073c28803d00315adbb59df3fd9"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.504731 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fc286" event={"ID":"4ad84a5d-2ceb-457c-af21-d15ac31a54ef","Type":"ContainerStarted","Data":"6b0a6b9fc719d682ae7a14411b42e2c46de7fc7da50ef0f51c9939914a254262"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.505757 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6946k" 
event={"ID":"40deae84-a862-4cbf-8acc-d03e9f516ff4","Type":"ContainerStarted","Data":"1ca64d58d3af5cd4264d6961feffa4419e433afa18cfd3b4afd8faeb8838cda0"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.508191 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5" event={"ID":"2914c017-5539-4c32-9b8a-c494dd5b397a","Type":"ContainerStarted","Data":"9f4fc6041e46fc5d80a03005d6311aaf416d693543ec922bb955ab89ee4de752"} Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.516144 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:46 crc kubenswrapper[4752]: E0929 10:46:46.518141 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:47.018122495 +0000 UTC m=+147.807264162 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.522952 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.618660 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:46 crc kubenswrapper[4752]: E0929 10:46:46.619506 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:47.119487202 +0000 UTC m=+147.908628869 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.641835 4752 patch_prober.go:28] interesting pod/router-default-5444994796-r8k8r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 10:46:46 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Sep 29 10:46:46 crc kubenswrapper[4752]: [+]process-running ok Sep 29 10:46:46 crc kubenswrapper[4752]: healthz check failed Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.641944 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8k8r" podUID="d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.724334 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:46 crc kubenswrapper[4752]: E0929 10:46:46.724729 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-29 10:46:47.224712077 +0000 UTC m=+148.013853744 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.825861 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:46 crc kubenswrapper[4752]: E0929 10:46:46.826437 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:47.326417434 +0000 UTC m=+148.115559101 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.936832 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:46 crc kubenswrapper[4752]: E0929 10:46:46.936998 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:47.436964335 +0000 UTC m=+148.226106012 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:46 crc kubenswrapper[4752]: I0929 10:46:46.937242 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:46 crc kubenswrapper[4752]: E0929 10:46:46.937887 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:47.43786479 +0000 UTC m=+148.227006467 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.044426 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:47 crc kubenswrapper[4752]: E0929 10:46:47.044846 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:47.544831002 +0000 UTC m=+148.333972669 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.147860 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:47 crc kubenswrapper[4752]: E0929 10:46:47.148441 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:47.648423191 +0000 UTC m=+148.437564848 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.249739 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:47 crc kubenswrapper[4752]: E0929 10:46:47.250455 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:47.750427996 +0000 UTC m=+148.539569663 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.351794 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:47 crc kubenswrapper[4752]: E0929 10:46:47.352175 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:47.852163644 +0000 UTC m=+148.641305311 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.452851 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:47 crc kubenswrapper[4752]: E0929 10:46:47.453081 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:47.953040708 +0000 UTC m=+148.742182375 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.453521 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:47 crc kubenswrapper[4752]: E0929 10:46:47.454014 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:47.953996684 +0000 UTC m=+148.743138351 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.519597 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-974fr" event={"ID":"5eaef9fd-30e6-47e1-afbf-8c0a464a4512","Type":"ContainerStarted","Data":"dd38887d231486bd8fb5afb5657a95bc2cb39bb6f4f1131db7b8b79ebaf185ed"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.519692 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-974fr" event={"ID":"5eaef9fd-30e6-47e1-afbf-8c0a464a4512","Type":"ContainerStarted","Data":"0628f56a667f0136d4f61c650b7bd02956dbe46a9bf85fe143f07aafe91432a7"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.522441 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8vwzv" event={"ID":"9681ca7c-896e-41a2-8f2a-129b813f6695","Type":"ContainerStarted","Data":"cf38814978660702ebeded834099c9e7ecbefec4c855edc060f5f02a16a60af8"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.522505 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8vwzv" event={"ID":"9681ca7c-896e-41a2-8f2a-129b813f6695","Type":"ContainerStarted","Data":"ca97a90ff64084c9bd33b6ba0adc0093e44c496048e3f6cd1748c8a9008cf2b4"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.522606 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8vwzv"
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.526482 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-krscp" event={"ID":"f94d0b66-3c2d-46f5-bcdb-078cbb7cccae","Type":"ContainerStarted","Data":"b2f50453b75edbc0736fe5e3ca1456ff884c228c65c7dc0090a3971fc5a132c9"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.529386 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5" event={"ID":"2914c017-5539-4c32-9b8a-c494dd5b397a","Type":"ContainerStarted","Data":"31357fa42b879ec05c14e75b957e237fe12b3fdf3cae2d0402749f8a220c48bc"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.529415 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5" event={"ID":"2914c017-5539-4c32-9b8a-c494dd5b397a","Type":"ContainerStarted","Data":"2b862a5fa0103c64161f681ae7d488277ddd2fcfbcf89f26c5a7bec78e64d477"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.531288 4752 generic.go:334] "Generic (PLEG): container finished" podID="82b648a1-4d55-4994-b471-4938eeb34bd0" containerID="dce894c34be56d0fab77a9b21108792fa3dcadfeea66b4d8efa9a3a85634f017" exitCode=0
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.531982 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" event={"ID":"82b648a1-4d55-4994-b471-4938eeb34bd0","Type":"ContainerDied","Data":"dce894c34be56d0fab77a9b21108792fa3dcadfeea66b4d8efa9a3a85634f017"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.534489 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lwcrf" event={"ID":"074633af-0fd3-4335-ba5e-af7840383694","Type":"ContainerStarted","Data":"8ca1000f002aef839339fcec4f0074d2e9ccf587959fc2792578bd615dd53cdd"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.536996 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fc286" event={"ID":"4ad84a5d-2ceb-457c-af21-d15ac31a54ef","Type":"ContainerStarted","Data":"947e32c9d683ab2439180247b7b8f5b3dba14bb2d02effc413c97330c3d200e2"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.541583 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jvk96" event={"ID":"9a2a2cd8-7738-47ce-9432-770f570fa47e","Type":"ContainerStarted","Data":"c05e28c6ed250baca322af9b4083d00d3e745e83a1125f0e2ee798e50cdd32db"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.541639 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jvk96" event={"ID":"9a2a2cd8-7738-47ce-9432-770f570fa47e","Type":"ContainerStarted","Data":"e68a9538d3549c13ac036185ada58c18be39ae4f94395d49abfd6f96ff71c9d6"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.542256 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-jvk96"
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.545119 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-974fr" podStartSLOduration=127.545102098 podStartE2EDuration="2m7.545102098s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:47.544223675 +0000 UTC m=+148.333365332" watchObservedRunningTime="2025-09-29 10:46:47.545102098 +0000 UTC m=+148.334243755"
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.553906 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gvhpf" event={"ID":"259601c2-169a-446f-9fb7-e93aefc143b4","Type":"ContainerStarted","Data":"d5379b928e99869ae8be90d0db580c1bc6c9949c0f7dc1030743610dab4055a4"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.554242 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 10:46:47 crc kubenswrapper[4752]: E0929 10:46:47.554976 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:48.054544639 +0000 UTC m=+148.843686306 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.555114 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:47 crc kubenswrapper[4752]: E0929 10:46:47.556903 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:48.056894034 +0000 UTC m=+148.846035701 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.557163 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6" event={"ID":"c6e8ca63-ed5f-4e08-8c2a-ac799993720c","Type":"ContainerStarted","Data":"443e3730db8c6a80ca7322f318ddcf820b78625dcf5b1cde0926305837f5dec5"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.557913 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6"
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.561070 4752 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-b7cd6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body=
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.561142 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6" podUID="c6e8ca63-ed5f-4e08-8c2a-ac799993720c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused"
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.561602 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6946k" event={"ID":"40deae84-a862-4cbf-8acc-d03e9f516ff4","Type":"ContainerStarted","Data":"2f8685b747d3e402d65ec4c2199d6d57d283e0d158e99638b10e175b879b41b2"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.561652 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6946k" event={"ID":"40deae84-a862-4cbf-8acc-d03e9f516ff4","Type":"ContainerStarted","Data":"0b520777f5f28bb6b3b077cafd15a8a85828da81cbe12260bda684079d559152"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.568056 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9" event={"ID":"2c7183be-dcde-4752-8650-ffd4e9834ce5","Type":"ContainerStarted","Data":"0eda0038d25a1bdab15edfd6a51a7c1eea31a77761b22a0308f36b66ebd8452b"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.568655 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9"
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.572945 4752 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-5c9h9 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body=
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.573021 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9" podUID="2c7183be-dcde-4752-8650-ffd4e9834ce5" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused"
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.578485 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28" event={"ID":"ae8c092c-ec9d-456a-9ba3-5501c22f6280","Type":"ContainerStarted","Data":"0f2d0a93bf241ea6876c7045753bca8c3b6ffb6faf20b37b4418f0b7c8b82cc4"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.594841 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-krscp" podStartSLOduration=127.59479351 podStartE2EDuration="2m7.59479351s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:47.592556808 +0000 UTC m=+148.381698485" watchObservedRunningTime="2025-09-29 10:46:47.59479351 +0000 UTC m=+148.383935197"
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.599035 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s8pf8" event={"ID":"24ab4270-1ece-4201-94ae-51c71902c3f1","Type":"ContainerStarted","Data":"0c37964032d0a99c1f2a6e24bc15b635aed58bbbb2716d7ad86f2ed3c03a80b4"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.605894 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-cxqx7" event={"ID":"0f54680a-b95f-4074-a763-859a7e96962d","Type":"ContainerStarted","Data":"1fb983ab2e904d5da7618c93863391bb7350d660b2e5f3f182c6db3bcb4c4b44"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.605945 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-cxqx7" event={"ID":"0f54680a-b95f-4074-a763-859a7e96962d","Type":"ContainerStarted","Data":"0ccce26e6a44b8909863eae4983eb507d7a9d256234b3685e8eba9d777df1b37"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.621753 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" event={"ID":"ae78dd2a-9513-40d7-b38c-6ba848c3c558","Type":"ContainerStarted","Data":"95c651cdb9d77fd14e49318bfed8c3326250f3ddd46c06e58b241221a23ecaf5"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.623399 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hnqp5" podStartSLOduration=127.623382439 podStartE2EDuration="2m7.623382439s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:47.621145928 +0000 UTC m=+148.410287595" watchObservedRunningTime="2025-09-29 10:46:47.623382439 +0000 UTC m=+148.412524106"
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.654117 4752 patch_prober.go:28] interesting pod/router-default-5444994796-r8k8r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 29 10:46:47 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld
Sep 29 10:46:47 crc kubenswrapper[4752]: [+]process-running ok
Sep 29 10:46:47 crc kubenswrapper[4752]: healthz check failed
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.654223 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8k8r" podUID="d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.656016 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" event={"ID":"d04ea3ea-71ac-481c-990b-a989a6f61516","Type":"ContainerStarted","Data":"79c2d14bb1544aad70f4c45adb33236836c53b91f8fe8c1bfa1f014194ce9be3"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.656955 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv"
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.657814 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 10:46:47 crc kubenswrapper[4752]: E0929 10:46:47.658131 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:48.158105757 +0000 UTC m=+148.947247424 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.667747 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh"
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.658973 4752 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-pbmfv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body=
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.676786 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-bhw29" event={"ID":"44ae2b29-ec3a-4321-8590-4d316d810034","Type":"ContainerStarted","Data":"2d5ae920fb03d835bbb18b558ade05d9a81d1ff15ddad2c0ab458ab4d1ab6ba1"}
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.685572 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-bhw29"
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.685834 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" podUID="d04ea3ea-71ac-481c-990b-a989a6f61516" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused"
Sep 29 10:46:47 crc kubenswrapper[4752]: E0929 10:46:47.673759 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:48.173740539 +0000 UTC m=+148.962882286 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.681117 4752 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-q8xbr container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.687032 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" podUID="78a002dc-c902-472c-b269-9ec7c99ab835" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.687168 4752 patch_prober.go:28] interesting pod/console-operator-58897d9998-hbftw container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.687266 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-hbftw" podUID="6445acca-bd3a-41a4-8b4a-16607771e077" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" 
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.699305 4752 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhw29 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.699595 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bhw29" podUID="44ae2b29-ec3a-4321-8590-4d316d810034" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.719297 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lwcrf" podStartSLOduration=127.719269896 podStartE2EDuration="2m7.719269896s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:47.71109256 +0000 UTC m=+148.500234237" watchObservedRunningTime="2025-09-29 10:46:47.719269896 +0000 UTC m=+148.508411563"
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.719432 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8vwzv" podStartSLOduration=126.71942718 podStartE2EDuration="2m6.71942718s" podCreationTimestamp="2025-09-29 10:44:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:47.686751588 +0000 UTC m=+148.475893265" watchObservedRunningTime="2025-09-29 10:46:47.71942718 +0000 UTC m=+148.508568847"
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.742611 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fc286" podStartSLOduration=8.742591278999999 podStartE2EDuration="8.742591279s" podCreationTimestamp="2025-09-29 10:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:47.74150861 +0000 UTC m=+148.530650277" watchObservedRunningTime="2025-09-29 10:46:47.742591279 +0000 UTC m=+148.531732946"
Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.807185 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 10:46:47 crc kubenswrapper[4752]: E0929 10:46:47.810133 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:48.310099563 +0000 UTC m=+149.099241400 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.911162 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:47 crc kubenswrapper[4752]: E0929 10:46:47.911846 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:48.411815019 +0000 UTC m=+149.200956686 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:47 crc kubenswrapper[4752]: I0929 10:46:47.977915 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jvk96" podStartSLOduration=8.977896764 podStartE2EDuration="8.977896764s" podCreationTimestamp="2025-09-29 10:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:47.888784374 +0000 UTC m=+148.677926041" watchObservedRunningTime="2025-09-29 10:46:47.977896764 +0000 UTC m=+148.767038421" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.018178 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:48 crc kubenswrapper[4752]: E0929 10:46:48.018393 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:48.518352709 +0000 UTC m=+149.307494376 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.019154 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:48 crc kubenswrapper[4752]: E0929 10:46:48.019835 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:48.519775899 +0000 UTC m=+149.308917576 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.048745 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9" podStartSLOduration=128.048713498 podStartE2EDuration="2m8.048713498s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:47.980709741 +0000 UTC m=+148.769851408" watchObservedRunningTime="2025-09-29 10:46:48.048713498 +0000 UTC m=+148.837855165" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.050856 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gvhpf" podStartSLOduration=127.050847437 podStartE2EDuration="2m7.050847437s" podCreationTimestamp="2025-09-29 10:44:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:48.04949661 +0000 UTC m=+148.838638297" watchObservedRunningTime="2025-09-29 10:46:48.050847437 +0000 UTC m=+148.839989104" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.120044 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:48 crc kubenswrapper[4752]: E0929 10:46:48.120329 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:48.620307943 +0000 UTC m=+149.409449600 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.128989 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s8pf8" podStartSLOduration=128.128969912 podStartE2EDuration="2m8.128969912s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:48.082111509 +0000 UTC m=+148.871253176" watchObservedRunningTime="2025-09-29 10:46:48.128969912 +0000 UTC m=+148.918111569" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.151009 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6" podStartSLOduration=127.15098939 podStartE2EDuration="2m7.15098939s" podCreationTimestamp="2025-09-29 10:44:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-29 10:46:48.129185329 +0000 UTC m=+148.918326996" watchObservedRunningTime="2025-09-29 10:46:48.15098939 +0000 UTC m=+148.940131057" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.175028 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-bhw29" podStartSLOduration=128.174993933 podStartE2EDuration="2m8.174993933s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:48.149641543 +0000 UTC m=+148.938783210" watchObservedRunningTime="2025-09-29 10:46:48.174993933 +0000 UTC m=+148.964135600" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.177848 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6946k" podStartSLOduration=128.177826061 podStartE2EDuration="2m8.177826061s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:48.175678292 +0000 UTC m=+148.964819959" watchObservedRunningTime="2025-09-29 10:46:48.177826061 +0000 UTC m=+148.966967728" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.212638 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28" podStartSLOduration=108.212611541 podStartE2EDuration="1m48.212611541s" podCreationTimestamp="2025-09-29 10:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:48.209620659 +0000 UTC m=+148.998762346" watchObservedRunningTime="2025-09-29 10:46:48.212611541 +0000 UTC m=+149.001753208" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 
10:46:48.221753 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:48 crc kubenswrapper[4752]: E0929 10:46:48.222266 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:48.722249567 +0000 UTC m=+149.511391224 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.230815 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw85q" podStartSLOduration=127.230785943 podStartE2EDuration="2m7.230785943s" podCreationTimestamp="2025-09-29 10:44:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:48.230215667 +0000 UTC m=+149.019357344" watchObservedRunningTime="2025-09-29 10:46:48.230785943 +0000 UTC m=+149.019927610" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.280013 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bq279" podStartSLOduration=128.27998885 podStartE2EDuration="2m8.27998885s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:48.278034327 +0000 UTC m=+149.067175994" watchObservedRunningTime="2025-09-29 10:46:48.27998885 +0000 UTC m=+149.069130517" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.326042 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:48 crc kubenswrapper[4752]: E0929 10:46:48.326441 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:48.826393021 +0000 UTC m=+149.615534678 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.326527 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:48 crc kubenswrapper[4752]: E0929 10:46:48.327059 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:48.82704949 +0000 UTC m=+149.616191157 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.344526 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-cxqx7" podStartSLOduration=128.344502471 podStartE2EDuration="2m8.344502471s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:48.334758022 +0000 UTC m=+149.123899689" watchObservedRunningTime="2025-09-29 10:46:48.344502471 +0000 UTC m=+149.133644148" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.377592 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-79rht" podStartSLOduration=128.377567564 podStartE2EDuration="2m8.377567564s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:48.375720363 +0000 UTC m=+149.164862030" watchObservedRunningTime="2025-09-29 10:46:48.377567564 +0000 UTC m=+149.166709231" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.430595 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:48 crc kubenswrapper[4752]: E0929 10:46:48.430742 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:48.9307083 +0000 UTC m=+149.719849967 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.431425 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:48 crc kubenswrapper[4752]: E0929 10:46:48.431894 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:48.931878033 +0000 UTC m=+149.721019700 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.433112 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" podStartSLOduration=128.433079176 podStartE2EDuration="2m8.433079176s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:48.430613038 +0000 UTC m=+149.219754725" watchObservedRunningTime="2025-09-29 10:46:48.433079176 +0000 UTC m=+149.222220863" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.532004 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:48 crc kubenswrapper[4752]: E0929 10:46:48.532217 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:49.03217535 +0000 UTC m=+149.821317017 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.558319 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h5ql5" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.633316 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:48 crc kubenswrapper[4752]: E0929 10:46:48.633962 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:49.133930179 +0000 UTC m=+149.923072016 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.634849 4752 patch_prober.go:28] interesting pod/router-default-5444994796-r8k8r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 10:46:48 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Sep 29 10:46:48 crc kubenswrapper[4752]: [+]process-running ok Sep 29 10:46:48 crc kubenswrapper[4752]: healthz check failed Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.634952 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8k8r" podUID="d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.685253 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" event={"ID":"a2cc4fff-1922-45ac-871c-bab6f753b026","Type":"ContainerStarted","Data":"fdbcf6cebfcbc1d5837ec742ee11af39378ff1d93929614792bc73a6ae426ae2"} Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.689921 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" event={"ID":"82b648a1-4d55-4994-b471-4938eeb34bd0","Type":"ContainerStarted","Data":"4769c8085ba4a31f92efca7f5d1751179e193dd8d4bca03523c1abe98cce96b7"} Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.689988 
4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gvhpf" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.693089 4752 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhw29 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.693174 4752 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-5c9h9 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.693238 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9" podUID="2c7183be-dcde-4752-8650-ffd4e9834ce5" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.693100 4752 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-gvhpf container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.693170 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bhw29" podUID="44ae2b29-ec3a-4321-8590-4d316d810034" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Sep 29 10:46:48 crc 
kubenswrapper[4752]: I0929 10:46:48.693100 4752 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-pbmfv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.693455 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" podUID="d04ea3ea-71ac-481c-990b-a989a6f61516" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.693430 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gvhpf" podUID="259601c2-169a-446f-9fb7-e93aefc143b4" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.727567 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" podStartSLOduration=127.727551663 podStartE2EDuration="2m7.727551663s" podCreationTimestamp="2025-09-29 10:44:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:48.723511561 +0000 UTC m=+149.512653228" watchObservedRunningTime="2025-09-29 10:46:48.727551663 +0000 UTC m=+149.516693330" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.749377 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:48 crc kubenswrapper[4752]: E0929 10:46:48.749549 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:49.24952845 +0000 UTC m=+150.038670117 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.752482 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:48 crc kubenswrapper[4752]: E0929 10:46:48.765398 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:49.256112511 +0000 UTC m=+150.045254378 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.854465 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:48 crc kubenswrapper[4752]: E0929 10:46:48.855021 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:49.35499769 +0000 UTC m=+150.144139357 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.956254 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.956381 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.956425 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.956476 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.956510 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.957694 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:48 crc kubenswrapper[4752]: E0929 10:46:48.957935 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:49.4579161 +0000 UTC m=+150.247057767 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.964895 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.964969 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:48 crc kubenswrapper[4752]: I0929 10:46:48.964985 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.047706 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.055136 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.057680 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:49 crc kubenswrapper[4752]: E0929 10:46:49.057908 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:49.55787679 +0000 UTC m=+150.347018457 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.058145 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:49 crc kubenswrapper[4752]: E0929 10:46:49.058507 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:49.558499446 +0000 UTC m=+150.347641113 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.062421 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.159701 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:49 crc kubenswrapper[4752]: E0929 10:46:49.159954 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:49.659927546 +0000 UTC m=+150.449069223 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.160134 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:49 crc kubenswrapper[4752]: E0929 10:46:49.160502 4752 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:49.660489931 +0000 UTC m=+150.449631598 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.260816 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:49 crc kubenswrapper[4752]: E0929 10:46:49.261049 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:49.761019426 +0000 UTC m=+150.550161093 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.261307 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:49 crc kubenswrapper[4752]: E0929 10:46:49.261586 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:49.761579571 +0000 UTC m=+150.550721238 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.362246 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:49 crc kubenswrapper[4752]: E0929 10:46:49.362591 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:49.862576658 +0000 UTC m=+150.651718325 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.375993 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-hbftw" Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.466644 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:49 crc kubenswrapper[4752]: E0929 10:46:49.466964 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:49.966952279 +0000 UTC m=+150.756093946 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.572682 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:49 crc kubenswrapper[4752]: E0929 10:46:49.573244 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:50.073218062 +0000 UTC m=+150.862359729 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.638941 4752 patch_prober.go:28] interesting pod/router-default-5444994796-r8k8r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 10:46:49 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Sep 29 10:46:49 crc kubenswrapper[4752]: [+]process-running ok Sep 29 10:46:49 crc kubenswrapper[4752]: healthz check failed Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.638990 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8k8r" podUID="d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.668701 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b7cd6" Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.675339 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:49 crc kubenswrapper[4752]: E0929 
10:46:49.675709 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:50.17569705 +0000 UTC m=+150.964838717 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.708714 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" event={"ID":"a2cc4fff-1922-45ac-871c-bab6f753b026","Type":"ContainerStarted","Data":"0982615c6de295fa9fdc16d1f50fc378beffa0f0e04214a118f96d7d6673bc20"} Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.710897 4752 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-pbmfv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.710937 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" podUID="d04ea3ea-71ac-481c-990b-a989a6f61516" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.718180 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5c9h9" Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.734554 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gvhpf" Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.776770 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:49 crc kubenswrapper[4752]: E0929 10:46:49.780921 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:50.280889103 +0000 UTC m=+151.070030770 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.888536 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:49 crc kubenswrapper[4752]: E0929 10:46:49.889245 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:50.389231174 +0000 UTC m=+151.178372861 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:49 crc kubenswrapper[4752]: I0929 10:46:49.989820 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:49 crc kubenswrapper[4752]: E0929 10:46:49.990142 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:50.490126689 +0000 UTC m=+151.279268356 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.094546 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:50 crc kubenswrapper[4752]: E0929 10:46:50.094877 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:50.59486605 +0000 UTC m=+151.384007717 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.202333 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:50 crc kubenswrapper[4752]: E0929 10:46:50.202852 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:50.702837029 +0000 UTC m=+151.491978696 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.315162 4752 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.316561 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:50 crc kubenswrapper[4752]: E0929 10:46:50.316923 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:50.816910647 +0000 UTC m=+151.606052304 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.417925 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:50 crc kubenswrapper[4752]: E0929 10:46:50.418142 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:50.91811264 +0000 UTC m=+151.707254307 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.520356 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:50 crc kubenswrapper[4752]: E0929 10:46:50.520845 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:51.020827445 +0000 UTC m=+151.809969112 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.547458 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5tlth"] Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.548426 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5tlth" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.553403 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.573695 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5tlth"] Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.621437 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:50 crc kubenswrapper[4752]: E0929 10:46:50.621682 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:51.121640798 +0000 UTC m=+151.910782465 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.621756 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06f9e526-21c6-4e20-b1a8-8f4fbfaa6413-utilities\") pod \"community-operators-5tlth\" (UID: \"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413\") " pod="openshift-marketplace/community-operators-5tlth" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.621980 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.622070 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrxjl\" (UniqueName: \"kubernetes.io/projected/06f9e526-21c6-4e20-b1a8-8f4fbfaa6413-kube-api-access-qrxjl\") pod \"community-operators-5tlth\" (UID: \"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413\") " pod="openshift-marketplace/community-operators-5tlth" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.622302 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/06f9e526-21c6-4e20-b1a8-8f4fbfaa6413-catalog-content\") pod \"community-operators-5tlth\" (UID: \"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413\") " pod="openshift-marketplace/community-operators-5tlth" Sep 29 10:46:50 crc kubenswrapper[4752]: E0929 10:46:50.622433 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:51.122409868 +0000 UTC m=+151.911551535 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.633106 4752 patch_prober.go:28] interesting pod/router-default-5444994796-r8k8r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 10:46:50 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Sep 29 10:46:50 crc kubenswrapper[4752]: [+]process-running ok Sep 29 10:46:50 crc kubenswrapper[4752]: healthz check failed Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.633184 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8k8r" podUID="d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.716338 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"449c84685b18dbac64217d8e0d54cbc76cb1008a1b3df78acfc9a08c8332a399"} Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.716955 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3d13d691d3c5898b9c314ce411b697367f1d6bf112b03659c5eade16de0f7ac3"} Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.719943 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" event={"ID":"a2cc4fff-1922-45ac-871c-bab6f753b026","Type":"ContainerStarted","Data":"008e7a9909472a48c5f98f95bcf5a6ac68ce98de98350712a41773d11bd07fd7"} Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.723011 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"245d4e13aa306327df94ba69eb88af9d7ff5e2a1df8fa3c5d867e2bf42b9e848"} Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.723066 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"277a65386f1e93f3abcdb331ebeeb9351c8b6a086f962b845f2b804d5c2a3ca5"} Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.723273 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.723941 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:50 crc kubenswrapper[4752]: E0929 10:46:50.724179 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:51.224158237 +0000 UTC m=+152.013299904 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.724269 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.724324 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrxjl\" (UniqueName: \"kubernetes.io/projected/06f9e526-21c6-4e20-b1a8-8f4fbfaa6413-kube-api-access-qrxjl\") pod \"community-operators-5tlth\" (UID: \"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413\") " pod="openshift-marketplace/community-operators-5tlth" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.724419 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06f9e526-21c6-4e20-b1a8-8f4fbfaa6413-catalog-content\") pod \"community-operators-5tlth\" (UID: \"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413\") " pod="openshift-marketplace/community-operators-5tlth" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.724479 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06f9e526-21c6-4e20-b1a8-8f4fbfaa6413-utilities\") pod \"community-operators-5tlth\" (UID: \"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413\") " pod="openshift-marketplace/community-operators-5tlth" Sep 29 10:46:50 crc kubenswrapper[4752]: E0929 10:46:50.724613 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:51.224606109 +0000 UTC m=+152.013747776 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.725081 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06f9e526-21c6-4e20-b1a8-8f4fbfaa6413-utilities\") pod \"community-operators-5tlth\" (UID: \"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413\") " pod="openshift-marketplace/community-operators-5tlth" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.725227 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06f9e526-21c6-4e20-b1a8-8f4fbfaa6413-catalog-content\") pod \"community-operators-5tlth\" (UID: \"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413\") " pod="openshift-marketplace/community-operators-5tlth" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.725558 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ee4b05d859d5e4f01afd3c0edf1969a5a37569e4509efc94184ad614b76b2187"} Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.725611 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7f5eb93fc435646cefaff60da28f7a4dce5d664532e013075c3e3ba920b2e928"} Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.733181 4752 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2pwfh"] Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.734276 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2pwfh" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.738223 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.750572 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrxjl\" (UniqueName: \"kubernetes.io/projected/06f9e526-21c6-4e20-b1a8-8f4fbfaa6413-kube-api-access-qrxjl\") pod \"community-operators-5tlth\" (UID: \"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413\") " pod="openshift-marketplace/community-operators-5tlth" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.758502 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2pwfh"] Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.825336 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:50 crc kubenswrapper[4752]: E0929 10:46:50.825520 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:51.325488573 +0000 UTC m=+152.114630240 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.825832 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.825922 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc2lj\" (UniqueName: \"kubernetes.io/projected/83d184f7-5dec-4c4c-b53e-d26af311916c-kube-api-access-dc2lj\") pod \"certified-operators-2pwfh\" (UID: \"83d184f7-5dec-4c4c-b53e-d26af311916c\") " pod="openshift-marketplace/certified-operators-2pwfh" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.825992 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83d184f7-5dec-4c4c-b53e-d26af311916c-catalog-content\") pod \"certified-operators-2pwfh\" (UID: \"83d184f7-5dec-4c4c-b53e-d26af311916c\") " pod="openshift-marketplace/certified-operators-2pwfh" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.826146 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/83d184f7-5dec-4c4c-b53e-d26af311916c-utilities\") pod \"certified-operators-2pwfh\" (UID: \"83d184f7-5dec-4c4c-b53e-d26af311916c\") " pod="openshift-marketplace/certified-operators-2pwfh" Sep 29 10:46:50 crc kubenswrapper[4752]: E0929 10:46:50.826263 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:51.326244464 +0000 UTC m=+152.115386131 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.865270 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5tlth" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.927552 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:50 crc kubenswrapper[4752]: E0929 10:46:50.927979 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-29 10:46:51.427884929 +0000 UTC m=+152.217026596 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.928345 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.928497 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc2lj\" (UniqueName: \"kubernetes.io/projected/83d184f7-5dec-4c4c-b53e-d26af311916c-kube-api-access-dc2lj\") pod \"certified-operators-2pwfh\" (UID: \"83d184f7-5dec-4c4c-b53e-d26af311916c\") " pod="openshift-marketplace/certified-operators-2pwfh" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.928646 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83d184f7-5dec-4c4c-b53e-d26af311916c-catalog-content\") pod \"certified-operators-2pwfh\" (UID: \"83d184f7-5dec-4c4c-b53e-d26af311916c\") " pod="openshift-marketplace/certified-operators-2pwfh" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.928790 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/83d184f7-5dec-4c4c-b53e-d26af311916c-utilities\") pod \"certified-operators-2pwfh\" (UID: \"83d184f7-5dec-4c4c-b53e-d26af311916c\") " pod="openshift-marketplace/certified-operators-2pwfh" Sep 29 10:46:50 crc kubenswrapper[4752]: E0929 10:46:50.929382 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:51.429370821 +0000 UTC m=+152.218512488 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.929737 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83d184f7-5dec-4c4c-b53e-d26af311916c-catalog-content\") pod \"certified-operators-2pwfh\" (UID: \"83d184f7-5dec-4c4c-b53e-d26af311916c\") " pod="openshift-marketplace/certified-operators-2pwfh" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.930229 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83d184f7-5dec-4c4c-b53e-d26af311916c-utilities\") pod \"certified-operators-2pwfh\" (UID: \"83d184f7-5dec-4c4c-b53e-d26af311916c\") " pod="openshift-marketplace/certified-operators-2pwfh" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.933880 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l2pww"] Sep 29 10:46:50 crc 
kubenswrapper[4752]: I0929 10:46:50.934893 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2pww" Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.949609 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l2pww"] Sep 29 10:46:50 crc kubenswrapper[4752]: I0929 10:46:50.952340 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc2lj\" (UniqueName: \"kubernetes.io/projected/83d184f7-5dec-4c4c-b53e-d26af311916c-kube-api-access-dc2lj\") pod \"certified-operators-2pwfh\" (UID: \"83d184f7-5dec-4c4c-b53e-d26af311916c\") " pod="openshift-marketplace/certified-operators-2pwfh" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.018483 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.022075 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.025142 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.025198 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.027257 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.031023 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:51 crc kubenswrapper[4752]: E0929 10:46:51.031390 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:51.531345514 +0000 UTC m=+152.320487171 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.031499 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.031788 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/501f967f-86b7-41e1-b650-0b122766a576-catalog-content\") pod \"community-operators-l2pww\" (UID: \"501f967f-86b7-41e1-b650-0b122766a576\") " pod="openshift-marketplace/community-operators-l2pww" Sep 29 10:46:51 crc kubenswrapper[4752]: E0929 10:46:51.031876 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 10:46:51.531859709 +0000 UTC m=+152.321001376 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-57fqh" (UID: "8f756d24-5e77-4130-b920-794234a82ece") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.031960 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/501f967f-86b7-41e1-b650-0b122766a576-utilities\") pod \"community-operators-l2pww\" (UID: \"501f967f-86b7-41e1-b650-0b122766a576\") " pod="openshift-marketplace/community-operators-l2pww" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.032001 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fb64\" (UniqueName: \"kubernetes.io/projected/501f967f-86b7-41e1-b650-0b122766a576-kube-api-access-5fb64\") pod \"community-operators-l2pww\" (UID: \"501f967f-86b7-41e1-b650-0b122766a576\") " pod="openshift-marketplace/community-operators-l2pww" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.069080 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2pwfh" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.133196 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.133655 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/501f967f-86b7-41e1-b650-0b122766a576-utilities\") pod \"community-operators-l2pww\" (UID: \"501f967f-86b7-41e1-b650-0b122766a576\") " pod="openshift-marketplace/community-operators-l2pww" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.133699 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fb64\" (UniqueName: \"kubernetes.io/projected/501f967f-86b7-41e1-b650-0b122766a576-kube-api-access-5fb64\") pod \"community-operators-l2pww\" (UID: \"501f967f-86b7-41e1-b650-0b122766a576\") " pod="openshift-marketplace/community-operators-l2pww" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.133852 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29994a3c-c9d4-436f-b6bf-c46f1cd81a57-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"29994a3c-c9d4-436f-b6bf-c46f1cd81a57\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.133884 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/501f967f-86b7-41e1-b650-0b122766a576-catalog-content\") pod \"community-operators-l2pww\" (UID: 
\"501f967f-86b7-41e1-b650-0b122766a576\") " pod="openshift-marketplace/community-operators-l2pww" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.133981 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29994a3c-c9d4-436f-b6bf-c46f1cd81a57-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"29994a3c-c9d4-436f-b6bf-c46f1cd81a57\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 10:46:51 crc kubenswrapper[4752]: E0929 10:46:51.134189 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 10:46:51.634152712 +0000 UTC m=+152.423294379 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.134734 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/501f967f-86b7-41e1-b650-0b122766a576-utilities\") pod \"community-operators-l2pww\" (UID: \"501f967f-86b7-41e1-b650-0b122766a576\") " pod="openshift-marketplace/community-operators-l2pww" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.135038 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/501f967f-86b7-41e1-b650-0b122766a576-catalog-content\") pod 
\"community-operators-l2pww\" (UID: \"501f967f-86b7-41e1-b650-0b122766a576\") " pod="openshift-marketplace/community-operators-l2pww" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.149184 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mskwx"] Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.150962 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5tlth"] Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.151111 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mskwx" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.159792 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mskwx"] Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.185615 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fb64\" (UniqueName: \"kubernetes.io/projected/501f967f-86b7-41e1-b650-0b122766a576-kube-api-access-5fb64\") pod \"community-operators-l2pww\" (UID: \"501f967f-86b7-41e1-b650-0b122766a576\") " pod="openshift-marketplace/community-operators-l2pww" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.211295 4752 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-09-29T10:46:50.315432316Z","Handler":null,"Name":""} Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.227930 4752 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.227994 4752 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: 
/var/lib/kubelet/plugins/csi-hostpath/csi.sock Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.235968 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37b4288c-7657-4647-b24e-98547f53c24f-catalog-content\") pod \"certified-operators-mskwx\" (UID: \"37b4288c-7657-4647-b24e-98547f53c24f\") " pod="openshift-marketplace/certified-operators-mskwx" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.236029 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s7gw\" (UniqueName: \"kubernetes.io/projected/37b4288c-7657-4647-b24e-98547f53c24f-kube-api-access-2s7gw\") pod \"certified-operators-mskwx\" (UID: \"37b4288c-7657-4647-b24e-98547f53c24f\") " pod="openshift-marketplace/certified-operators-mskwx" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.236072 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.236109 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37b4288c-7657-4647-b24e-98547f53c24f-utilities\") pod \"certified-operators-mskwx\" (UID: \"37b4288c-7657-4647-b24e-98547f53c24f\") " pod="openshift-marketplace/certified-operators-mskwx" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.236151 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/29994a3c-c9d4-436f-b6bf-c46f1cd81a57-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"29994a3c-c9d4-436f-b6bf-c46f1cd81a57\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.236172 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29994a3c-c9d4-436f-b6bf-c46f1cd81a57-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"29994a3c-c9d4-436f-b6bf-c46f1cd81a57\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.236262 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29994a3c-c9d4-436f-b6bf-c46f1cd81a57-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"29994a3c-c9d4-436f-b6bf-c46f1cd81a57\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.240257 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.240353 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.240419 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.248261 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-7sz2n" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.252915 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2pww" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.267714 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29994a3c-c9d4-436f-b6bf-c46f1cd81a57-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"29994a3c-c9d4-436f-b6bf-c46f1cd81a57\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.292762 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-57fqh\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") " pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.310583 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.338472 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.338845 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37b4288c-7657-4647-b24e-98547f53c24f-utilities\") pod \"certified-operators-mskwx\" (UID: \"37b4288c-7657-4647-b24e-98547f53c24f\") " pod="openshift-marketplace/certified-operators-mskwx" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.338960 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37b4288c-7657-4647-b24e-98547f53c24f-catalog-content\") pod \"certified-operators-mskwx\" (UID: \"37b4288c-7657-4647-b24e-98547f53c24f\") " pod="openshift-marketplace/certified-operators-mskwx" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.338992 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s7gw\" (UniqueName: \"kubernetes.io/projected/37b4288c-7657-4647-b24e-98547f53c24f-kube-api-access-2s7gw\") pod \"certified-operators-mskwx\" (UID: \"37b4288c-7657-4647-b24e-98547f53c24f\") " pod="openshift-marketplace/certified-operators-mskwx" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.341380 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37b4288c-7657-4647-b24e-98547f53c24f-utilities\") pod \"certified-operators-mskwx\" (UID: \"37b4288c-7657-4647-b24e-98547f53c24f\") " 
pod="openshift-marketplace/certified-operators-mskwx" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.342292 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37b4288c-7657-4647-b24e-98547f53c24f-catalog-content\") pod \"certified-operators-mskwx\" (UID: \"37b4288c-7657-4647-b24e-98547f53c24f\") " pod="openshift-marketplace/certified-operators-mskwx" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.347427 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.374151 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s7gw\" (UniqueName: \"kubernetes.io/projected/37b4288c-7657-4647-b24e-98547f53c24f-kube-api-access-2s7gw\") pod \"certified-operators-mskwx\" (UID: \"37b4288c-7657-4647-b24e-98547f53c24f\") " pod="openshift-marketplace/certified-operators-mskwx" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.417281 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2pwfh"] Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.430434 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.472364 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mskwx" Sep 29 10:46:51 crc kubenswrapper[4752]: W0929 10:46:51.482958 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83d184f7_5dec_4c4c_b53e_d26af311916c.slice/crio-478dfb4261d161fbc523a095992095d392e0cb9e8221b31213d211a787458bc9 WatchSource:0}: Error finding container 478dfb4261d161fbc523a095992095d392e0cb9e8221b31213d211a787458bc9: Status 404 returned error can't find the container with id 478dfb4261d161fbc523a095992095d392e0cb9e8221b31213d211a787458bc9 Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.653615 4752 patch_prober.go:28] interesting pod/router-default-5444994796-r8k8r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 10:46:51 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Sep 29 10:46:51 crc kubenswrapper[4752]: [+]process-running ok Sep 29 10:46:51 crc kubenswrapper[4752]: healthz check failed Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.654179 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8k8r" podUID="d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.754636 4752 generic.go:334] "Generic (PLEG): container finished" podID="06f9e526-21c6-4e20-b1a8-8f4fbfaa6413" containerID="0c2575ba5de5d4677207db7b3a87e1307f2d55f53f76e0a828969c23ce075671" exitCode=0 Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.754738 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tlth" 
event={"ID":"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413","Type":"ContainerDied","Data":"0c2575ba5de5d4677207db7b3a87e1307f2d55f53f76e0a828969c23ce075671"} Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.754771 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tlth" event={"ID":"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413","Type":"ContainerStarted","Data":"724f61baa8b2f9fa972622270ab086c9aa8cccb84359fcc9704a3942f41fb509"} Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.758961 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.770183 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pwfh" event={"ID":"83d184f7-5dec-4c4c-b53e-d26af311916c","Type":"ContainerStarted","Data":"478dfb4261d161fbc523a095992095d392e0cb9e8221b31213d211a787458bc9"} Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.831772 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" event={"ID":"a2cc4fff-1922-45ac-871c-bab6f753b026","Type":"ContainerStarted","Data":"12cf1d1b637bb78c4a802864f5e7900986752ad6ce38e9e14331aa38fea8f0f9"} Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.864451 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-57fqh"] Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.897388 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-zr2lt" podStartSLOduration=12.897368016 podStartE2EDuration="12.897368016s" podCreationTimestamp="2025-09-29 10:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:51.872319624 +0000 UTC m=+152.661461311" watchObservedRunningTime="2025-09-29 
10:46:51.897368016 +0000 UTC m=+152.686509683" Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.900524 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.981989 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l2pww"] Sep 29 10:46:51 crc kubenswrapper[4752]: I0929 10:46:51.988247 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mskwx"] Sep 29 10:46:52 crc kubenswrapper[4752]: W0929 10:46:52.015175 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod501f967f_86b7_41e1_b650_0b122766a576.slice/crio-ca68c44f333d453fd782d1c1309ae2f10484695a1c5ab32feb33adc5998c1960 WatchSource:0}: Error finding container ca68c44f333d453fd782d1c1309ae2f10484695a1c5ab32feb33adc5998c1960: Status 404 returned error can't find the container with id ca68c44f333d453fd782d1c1309ae2f10484695a1c5ab32feb33adc5998c1960 Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.069160 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.532897 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ks2mq"] Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.534835 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ks2mq" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.538192 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.566154 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ks2mq"] Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.575516 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7beaf483-1002-4e94-a9ee-59e20e83f824-catalog-content\") pod \"redhat-marketplace-ks2mq\" (UID: \"7beaf483-1002-4e94-a9ee-59e20e83f824\") " pod="openshift-marketplace/redhat-marketplace-ks2mq" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.575658 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7beaf483-1002-4e94-a9ee-59e20e83f824-utilities\") pod \"redhat-marketplace-ks2mq\" (UID: \"7beaf483-1002-4e94-a9ee-59e20e83f824\") " pod="openshift-marketplace/redhat-marketplace-ks2mq" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.575686 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls7tg\" (UniqueName: \"kubernetes.io/projected/7beaf483-1002-4e94-a9ee-59e20e83f824-kube-api-access-ls7tg\") pod \"redhat-marketplace-ks2mq\" (UID: \"7beaf483-1002-4e94-a9ee-59e20e83f824\") " pod="openshift-marketplace/redhat-marketplace-ks2mq" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.577929 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.577974 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.580308 4752 patch_prober.go:28] interesting pod/console-f9d7485db-bp6hz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.35:8443/health\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.580390 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-bp6hz" podUID="53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28" containerName="console" probeResult="failure" output="Get \"https://10.217.0.35:8443/health\": dial tcp 10.217.0.35:8443: connect: connection refused" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.627456 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-r8k8r" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.635463 4752 patch_prober.go:28] interesting pod/router-default-5444994796-r8k8r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 10:46:52 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Sep 29 10:46:52 crc kubenswrapper[4752]: [+]process-running ok Sep 29 10:46:52 crc kubenswrapper[4752]: healthz check failed Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.635567 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8k8r" podUID="d9cf5107-f1bf-41ee-bd8a-e3dd8dbfeb5d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.676613 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7beaf483-1002-4e94-a9ee-59e20e83f824-utilities\") pod \"redhat-marketplace-ks2mq\" (UID: \"7beaf483-1002-4e94-a9ee-59e20e83f824\") " pod="openshift-marketplace/redhat-marketplace-ks2mq" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.676676 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls7tg\" (UniqueName: \"kubernetes.io/projected/7beaf483-1002-4e94-a9ee-59e20e83f824-kube-api-access-ls7tg\") pod \"redhat-marketplace-ks2mq\" (UID: \"7beaf483-1002-4e94-a9ee-59e20e83f824\") " pod="openshift-marketplace/redhat-marketplace-ks2mq" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.676706 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7beaf483-1002-4e94-a9ee-59e20e83f824-catalog-content\") pod \"redhat-marketplace-ks2mq\" (UID: \"7beaf483-1002-4e94-a9ee-59e20e83f824\") " pod="openshift-marketplace/redhat-marketplace-ks2mq" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.678066 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7beaf483-1002-4e94-a9ee-59e20e83f824-catalog-content\") pod \"redhat-marketplace-ks2mq\" (UID: \"7beaf483-1002-4e94-a9ee-59e20e83f824\") " pod="openshift-marketplace/redhat-marketplace-ks2mq" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.678399 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7beaf483-1002-4e94-a9ee-59e20e83f824-utilities\") pod \"redhat-marketplace-ks2mq\" (UID: \"7beaf483-1002-4e94-a9ee-59e20e83f824\") " pod="openshift-marketplace/redhat-marketplace-ks2mq" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.704165 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls7tg\" (UniqueName: 
\"kubernetes.io/projected/7beaf483-1002-4e94-a9ee-59e20e83f824-kube-api-access-ls7tg\") pod \"redhat-marketplace-ks2mq\" (UID: \"7beaf483-1002-4e94-a9ee-59e20e83f824\") " pod="openshift-marketplace/redhat-marketplace-ks2mq" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.823941 4752 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhw29 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.824031 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-bhw29" podUID="44ae2b29-ec3a-4321-8590-4d316d810034" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.824083 4752 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhw29 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.824162 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bhw29" podUID="44ae2b29-ec3a-4321-8590-4d316d810034" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.831035 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.831106 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.845472 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.847265 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.850358 4752 generic.go:334] "Generic (PLEG): container finished" podID="501f967f-86b7-41e1-b650-0b122766a576" containerID="2671ca5fa0374435f5ee61e712f3266a8816685bd16c407b477bd365cef7b46c" exitCode=0 Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.850420 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2pww" event={"ID":"501f967f-86b7-41e1-b650-0b122766a576","Type":"ContainerDied","Data":"2671ca5fa0374435f5ee61e712f3266a8816685bd16c407b477bd365cef7b46c"} Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.850441 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2pww" event={"ID":"501f967f-86b7-41e1-b650-0b122766a576","Type":"ContainerStarted","Data":"ca68c44f333d453fd782d1c1309ae2f10484695a1c5ab32feb33adc5998c1960"} Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.852431 4752 generic.go:334] "Generic (PLEG): container finished" podID="83d184f7-5dec-4c4c-b53e-d26af311916c" containerID="7dc55d23e87767af43550aa64cf3e4efd9c4bfbc9617b64d5e43b2ab992c95a1" exitCode=0 Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.852544 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pwfh" event={"ID":"83d184f7-5dec-4c4c-b53e-d26af311916c","Type":"ContainerDied","Data":"7dc55d23e87767af43550aa64cf3e4efd9c4bfbc9617b64d5e43b2ab992c95a1"} Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 
10:46:52.856932 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" event={"ID":"8f756d24-5e77-4130-b920-794234a82ece","Type":"ContainerStarted","Data":"8cf7746dad67cc0560347e51f223b56cf7fdf06f99e050e072696b98b4c557e1"} Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.857264 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" event={"ID":"8f756d24-5e77-4130-b920-794234a82ece","Type":"ContainerStarted","Data":"0feb392e49335a28f5a46448609b0aade3feabfa88f0c477493b2de19f207244"} Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.858124 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.858154 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ks2mq" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.860787 4752 generic.go:334] "Generic (PLEG): container finished" podID="37b4288c-7657-4647-b24e-98547f53c24f" containerID="b6308f197c883da696565e7608858a9aff07a91afbb7fac78aaaba9db165ec3e" exitCode=0 Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.860938 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mskwx" event={"ID":"37b4288c-7657-4647-b24e-98547f53c24f","Type":"ContainerDied","Data":"b6308f197c883da696565e7608858a9aff07a91afbb7fac78aaaba9db165ec3e"} Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.860978 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mskwx" event={"ID":"37b4288c-7657-4647-b24e-98547f53c24f","Type":"ContainerStarted","Data":"2cd9fc85d5a94b135a60cb791ba2a80976dabf99f0cc1d164d5bad08d0259c23"} Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.864669 4752 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"29994a3c-c9d4-436f-b6bf-c46f1cd81a57","Type":"ContainerStarted","Data":"3d63364a012bb46556f4bf619536b6606d586e7aa2249d8fe09b1c211089f5fc"} Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.864719 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"29994a3c-c9d4-436f-b6bf-c46f1cd81a57","Type":"ContainerStarted","Data":"60594a5e57536659b9887ce59b2e8c8acee2f72c0f67d01dc823ca549dc99b8c"} Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.936996 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wrp7s"] Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.938342 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wrp7s" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.978361 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.978337589 podStartE2EDuration="2.978337589s" podCreationTimestamp="2025-09-29 10:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:52.977035803 +0000 UTC m=+153.766177470" watchObservedRunningTime="2025-09-29 10:46:52.978337589 +0000 UTC m=+153.767479256" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.991652 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csmtp\" (UniqueName: \"kubernetes.io/projected/8142731c-cdef-4d76-aeae-bec64f2cb840-kube-api-access-csmtp\") pod \"redhat-marketplace-wrp7s\" (UID: \"8142731c-cdef-4d76-aeae-bec64f2cb840\") " pod="openshift-marketplace/redhat-marketplace-wrp7s" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 
10:46:52.991865 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8142731c-cdef-4d76-aeae-bec64f2cb840-catalog-content\") pod \"redhat-marketplace-wrp7s\" (UID: \"8142731c-cdef-4d76-aeae-bec64f2cb840\") " pod="openshift-marketplace/redhat-marketplace-wrp7s" Sep 29 10:46:52 crc kubenswrapper[4752]: I0929 10:46:52.991950 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8142731c-cdef-4d76-aeae-bec64f2cb840-utilities\") pod \"redhat-marketplace-wrp7s\" (UID: \"8142731c-cdef-4d76-aeae-bec64f2cb840\") " pod="openshift-marketplace/redhat-marketplace-wrp7s" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.006029 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrp7s"] Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.085922 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.094190 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8142731c-cdef-4d76-aeae-bec64f2cb840-utilities\") pod \"redhat-marketplace-wrp7s\" (UID: \"8142731c-cdef-4d76-aeae-bec64f2cb840\") " pod="openshift-marketplace/redhat-marketplace-wrp7s" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.094295 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csmtp\" (UniqueName: \"kubernetes.io/projected/8142731c-cdef-4d76-aeae-bec64f2cb840-kube-api-access-csmtp\") pod \"redhat-marketplace-wrp7s\" (UID: \"8142731c-cdef-4d76-aeae-bec64f2cb840\") " pod="openshift-marketplace/redhat-marketplace-wrp7s" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.094392 
4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8142731c-cdef-4d76-aeae-bec64f2cb840-catalog-content\") pod \"redhat-marketplace-wrp7s\" (UID: \"8142731c-cdef-4d76-aeae-bec64f2cb840\") " pod="openshift-marketplace/redhat-marketplace-wrp7s" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.095641 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8142731c-cdef-4d76-aeae-bec64f2cb840-catalog-content\") pod \"redhat-marketplace-wrp7s\" (UID: \"8142731c-cdef-4d76-aeae-bec64f2cb840\") " pod="openshift-marketplace/redhat-marketplace-wrp7s" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.096017 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8142731c-cdef-4d76-aeae-bec64f2cb840-utilities\") pod \"redhat-marketplace-wrp7s\" (UID: \"8142731c-cdef-4d76-aeae-bec64f2cb840\") " pod="openshift-marketplace/redhat-marketplace-wrp7s" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.096281 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" podStartSLOduration=133.096260184 podStartE2EDuration="2m13.096260184s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:46:53.09359344 +0000 UTC m=+153.882735107" watchObservedRunningTime="2025-09-29 10:46:53.096260184 +0000 UTC m=+153.885401851" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.130851 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csmtp\" (UniqueName: \"kubernetes.io/projected/8142731c-cdef-4d76-aeae-bec64f2cb840-kube-api-access-csmtp\") pod \"redhat-marketplace-wrp7s\" (UID: 
\"8142731c-cdef-4d76-aeae-bec64f2cb840\") " pod="openshift-marketplace/redhat-marketplace-wrp7s" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.265134 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wrp7s" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.321908 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ks2mq"] Sep 29 10:46:53 crc kubenswrapper[4752]: W0929 10:46:53.343384 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7beaf483_1002_4e94_a9ee_59e20e83f824.slice/crio-b7569a36b8ee02a6a5ce23995563465893751cd23f5dea2bc883496b095b26be WatchSource:0}: Error finding container b7569a36b8ee02a6a5ce23995563465893751cd23f5dea2bc883496b095b26be: Status 404 returned error can't find the container with id b7569a36b8ee02a6a5ce23995563465893751cd23f5dea2bc883496b095b26be Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.537203 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.538037 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.541589 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.542615 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.548869 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrp7s"] Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.552373 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 29 10:46:53 crc kubenswrapper[4752]: W0929 10:46:53.600403 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8142731c_cdef_4d76_aeae_bec64f2cb840.slice/crio-da86f7af7c4b5a8962282dbeed433104c63a7238e91cc36b34c8798f3b6fc776 WatchSource:0}: Error finding container da86f7af7c4b5a8962282dbeed433104c63a7238e91cc36b34c8798f3b6fc776: Status 404 returned error can't find the container with id da86f7af7c4b5a8962282dbeed433104c63a7238e91cc36b34c8798f3b6fc776 Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.614100 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb4bcdba-8a53-409c-9c6d-a8d464321183-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"fb4bcdba-8a53-409c-9c6d-a8d464321183\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.614207 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/fb4bcdba-8a53-409c-9c6d-a8d464321183-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fb4bcdba-8a53-409c-9c6d-a8d464321183\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.633150 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-r8k8r" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.636251 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-r8k8r" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.716460 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fb4bcdba-8a53-409c-9c6d-a8d464321183-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fb4bcdba-8a53-409c-9c6d-a8d464321183\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.716606 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb4bcdba-8a53-409c-9c6d-a8d464321183-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"fb4bcdba-8a53-409c-9c6d-a8d464321183\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.717334 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fb4bcdba-8a53-409c-9c6d-a8d464321183-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fb4bcdba-8a53-409c-9c6d-a8d464321183\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.740556 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb4bcdba-8a53-409c-9c6d-a8d464321183-kube-api-access\") 
pod \"revision-pruner-8-crc\" (UID: \"fb4bcdba-8a53-409c-9c6d-a8d464321183\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.874205 4752 generic.go:334] "Generic (PLEG): container finished" podID="ae8c092c-ec9d-456a-9ba3-5501c22f6280" containerID="0f2d0a93bf241ea6876c7045753bca8c3b6ffb6faf20b37b4418f0b7c8b82cc4" exitCode=0 Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.874291 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28" event={"ID":"ae8c092c-ec9d-456a-9ba3-5501c22f6280","Type":"ContainerDied","Data":"0f2d0a93bf241ea6876c7045753bca8c3b6ffb6faf20b37b4418f0b7c8b82cc4"} Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.879666 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ks2mq" event={"ID":"7beaf483-1002-4e94-a9ee-59e20e83f824","Type":"ContainerDied","Data":"801d8c51667f4a39d9989d384b4593ba509f6df68a623832900642a45a18e098"} Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.879517 4752 generic.go:334] "Generic (PLEG): container finished" podID="7beaf483-1002-4e94-a9ee-59e20e83f824" containerID="801d8c51667f4a39d9989d384b4593ba509f6df68a623832900642a45a18e098" exitCode=0 Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.881164 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ks2mq" event={"ID":"7beaf483-1002-4e94-a9ee-59e20e83f824","Type":"ContainerStarted","Data":"b7569a36b8ee02a6a5ce23995563465893751cd23f5dea2bc883496b095b26be"} Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.887568 4752 generic.go:334] "Generic (PLEG): container finished" podID="29994a3c-c9d4-436f-b6bf-c46f1cd81a57" containerID="3d63364a012bb46556f4bf619536b6606d586e7aa2249d8fe09b1c211089f5fc" exitCode=0 Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.887737 4752 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"29994a3c-c9d4-436f-b6bf-c46f1cd81a57","Type":"ContainerDied","Data":"3d63364a012bb46556f4bf619536b6606d586e7aa2249d8fe09b1c211089f5fc"} Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.903795 4752 generic.go:334] "Generic (PLEG): container finished" podID="8142731c-cdef-4d76-aeae-bec64f2cb840" containerID="0ef7a934b8b27f21b65574bab5ce52ffe65e4d45cfc3fcfa2a4efe2ff939d89b" exitCode=0 Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.904990 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrp7s" event={"ID":"8142731c-cdef-4d76-aeae-bec64f2cb840","Type":"ContainerDied","Data":"0ef7a934b8b27f21b65574bab5ce52ffe65e4d45cfc3fcfa2a4efe2ff939d89b"} Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.905020 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrp7s" event={"ID":"8142731c-cdef-4d76-aeae-bec64f2cb840","Type":"ContainerStarted","Data":"da86f7af7c4b5a8962282dbeed433104c63a7238e91cc36b34c8798f3b6fc776"} Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.915570 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jvmfr" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.936116 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.974633 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kxshj"] Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.975973 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kxshj"] Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.976087 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kxshj" Sep 29 10:46:53 crc kubenswrapper[4752]: I0929 10:46:53.982415 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.047608 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlzgx\" (UniqueName: \"kubernetes.io/projected/98952fd9-6515-4ba1-8d1a-490a2c3e33b1-kube-api-access-xlzgx\") pod \"redhat-operators-kxshj\" (UID: \"98952fd9-6515-4ba1-8d1a-490a2c3e33b1\") " pod="openshift-marketplace/redhat-operators-kxshj" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.047851 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98952fd9-6515-4ba1-8d1a-490a2c3e33b1-utilities\") pod \"redhat-operators-kxshj\" (UID: \"98952fd9-6515-4ba1-8d1a-490a2c3e33b1\") " pod="openshift-marketplace/redhat-operators-kxshj" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.048060 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98952fd9-6515-4ba1-8d1a-490a2c3e33b1-catalog-content\") pod \"redhat-operators-kxshj\" (UID: \"98952fd9-6515-4ba1-8d1a-490a2c3e33b1\") " pod="openshift-marketplace/redhat-operators-kxshj" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.149831 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlzgx\" (UniqueName: \"kubernetes.io/projected/98952fd9-6515-4ba1-8d1a-490a2c3e33b1-kube-api-access-xlzgx\") pod \"redhat-operators-kxshj\" (UID: \"98952fd9-6515-4ba1-8d1a-490a2c3e33b1\") " pod="openshift-marketplace/redhat-operators-kxshj" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.149888 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98952fd9-6515-4ba1-8d1a-490a2c3e33b1-utilities\") pod \"redhat-operators-kxshj\" (UID: \"98952fd9-6515-4ba1-8d1a-490a2c3e33b1\") " pod="openshift-marketplace/redhat-operators-kxshj" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.149931 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98952fd9-6515-4ba1-8d1a-490a2c3e33b1-catalog-content\") pod \"redhat-operators-kxshj\" (UID: \"98952fd9-6515-4ba1-8d1a-490a2c3e33b1\") " pod="openshift-marketplace/redhat-operators-kxshj" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.150494 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98952fd9-6515-4ba1-8d1a-490a2c3e33b1-catalog-content\") pod \"redhat-operators-kxshj\" (UID: \"98952fd9-6515-4ba1-8d1a-490a2c3e33b1\") " pod="openshift-marketplace/redhat-operators-kxshj" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.151102 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98952fd9-6515-4ba1-8d1a-490a2c3e33b1-utilities\") pod \"redhat-operators-kxshj\" (UID: \"98952fd9-6515-4ba1-8d1a-490a2c3e33b1\") " pod="openshift-marketplace/redhat-operators-kxshj" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.176908 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlzgx\" (UniqueName: \"kubernetes.io/projected/98952fd9-6515-4ba1-8d1a-490a2c3e33b1-kube-api-access-xlzgx\") pod \"redhat-operators-kxshj\" (UID: \"98952fd9-6515-4ba1-8d1a-490a2c3e33b1\") " pod="openshift-marketplace/redhat-operators-kxshj" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.296172 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kxshj" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.310214 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.343765 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ppd9s"] Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.347868 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppd9s" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.355916 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ppd9s"] Sep 29 10:46:54 crc kubenswrapper[4752]: W0929 10:46:54.363930 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfb4bcdba_8a53_409c_9c6d_a8d464321183.slice/crio-5698bfb62a248342249b2202dab1c360b45594c4a82b7020be458a7389d70dd0 WatchSource:0}: Error finding container 5698bfb62a248342249b2202dab1c360b45594c4a82b7020be458a7389d70dd0: Status 404 returned error can't find the container with id 5698bfb62a248342249b2202dab1c360b45594c4a82b7020be458a7389d70dd0 Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.459278 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a042fcf-8877-4ea2-97fc-272870ff20d3-utilities\") pod \"redhat-operators-ppd9s\" (UID: \"5a042fcf-8877-4ea2-97fc-272870ff20d3\") " pod="openshift-marketplace/redhat-operators-ppd9s" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.459349 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a042fcf-8877-4ea2-97fc-272870ff20d3-catalog-content\") pod \"redhat-operators-ppd9s\" (UID: 
\"5a042fcf-8877-4ea2-97fc-272870ff20d3\") " pod="openshift-marketplace/redhat-operators-ppd9s" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.459401 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7mvp\" (UniqueName: \"kubernetes.io/projected/5a042fcf-8877-4ea2-97fc-272870ff20d3-kube-api-access-d7mvp\") pod \"redhat-operators-ppd9s\" (UID: \"5a042fcf-8877-4ea2-97fc-272870ff20d3\") " pod="openshift-marketplace/redhat-operators-ppd9s" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.568548 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a042fcf-8877-4ea2-97fc-272870ff20d3-utilities\") pod \"redhat-operators-ppd9s\" (UID: \"5a042fcf-8877-4ea2-97fc-272870ff20d3\") " pod="openshift-marketplace/redhat-operators-ppd9s" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.568623 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a042fcf-8877-4ea2-97fc-272870ff20d3-catalog-content\") pod \"redhat-operators-ppd9s\" (UID: \"5a042fcf-8877-4ea2-97fc-272870ff20d3\") " pod="openshift-marketplace/redhat-operators-ppd9s" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.568733 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7mvp\" (UniqueName: \"kubernetes.io/projected/5a042fcf-8877-4ea2-97fc-272870ff20d3-kube-api-access-d7mvp\") pod \"redhat-operators-ppd9s\" (UID: \"5a042fcf-8877-4ea2-97fc-272870ff20d3\") " pod="openshift-marketplace/redhat-operators-ppd9s" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.569930 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a042fcf-8877-4ea2-97fc-272870ff20d3-catalog-content\") pod \"redhat-operators-ppd9s\" (UID: 
\"5a042fcf-8877-4ea2-97fc-272870ff20d3\") " pod="openshift-marketplace/redhat-operators-ppd9s" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.570092 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a042fcf-8877-4ea2-97fc-272870ff20d3-utilities\") pod \"redhat-operators-ppd9s\" (UID: \"5a042fcf-8877-4ea2-97fc-272870ff20d3\") " pod="openshift-marketplace/redhat-operators-ppd9s" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.590743 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7mvp\" (UniqueName: \"kubernetes.io/projected/5a042fcf-8877-4ea2-97fc-272870ff20d3-kube-api-access-d7mvp\") pod \"redhat-operators-ppd9s\" (UID: \"5a042fcf-8877-4ea2-97fc-272870ff20d3\") " pod="openshift-marketplace/redhat-operators-ppd9s" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.598273 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kxshj"] Sep 29 10:46:54 crc kubenswrapper[4752]: W0929 10:46:54.611064 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98952fd9_6515_4ba1_8d1a_490a2c3e33b1.slice/crio-6538a00acae804aeba215ba7bea54e6e820516b6b2dd119f655cd22fb4247473 WatchSource:0}: Error finding container 6538a00acae804aeba215ba7bea54e6e820516b6b2dd119f655cd22fb4247473: Status 404 returned error can't find the container with id 6538a00acae804aeba215ba7bea54e6e820516b6b2dd119f655cd22fb4247473 Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.695656 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ppd9s" Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.921706 4752 generic.go:334] "Generic (PLEG): container finished" podID="98952fd9-6515-4ba1-8d1a-490a2c3e33b1" containerID="ff48ef37f008c732e90ca6eadc1d92123e1de0a924df91778543cbe249304097" exitCode=0 Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.921865 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxshj" event={"ID":"98952fd9-6515-4ba1-8d1a-490a2c3e33b1","Type":"ContainerDied","Data":"ff48ef37f008c732e90ca6eadc1d92123e1de0a924df91778543cbe249304097"} Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.922207 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxshj" event={"ID":"98952fd9-6515-4ba1-8d1a-490a2c3e33b1","Type":"ContainerStarted","Data":"6538a00acae804aeba215ba7bea54e6e820516b6b2dd119f655cd22fb4247473"} Sep 29 10:46:54 crc kubenswrapper[4752]: I0929 10:46:54.924791 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fb4bcdba-8a53-409c-9c6d-a8d464321183","Type":"ContainerStarted","Data":"5698bfb62a248342249b2202dab1c360b45594c4a82b7020be458a7389d70dd0"} Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.032402 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ppd9s"] Sep 29 10:46:55 crc kubenswrapper[4752]: W0929 10:46:55.062097 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a042fcf_8877_4ea2_97fc_272870ff20d3.slice/crio-bee4668fdb54047db1f84b3365ff28db1d44a541dfcda5d1de7b2df84c57a84f WatchSource:0}: Error finding container bee4668fdb54047db1f84b3365ff28db1d44a541dfcda5d1de7b2df84c57a84f: Status 404 returned error can't find the container with id 
bee4668fdb54047db1f84b3365ff28db1d44a541dfcda5d1de7b2df84c57a84f Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.217308 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.281889 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29994a3c-c9d4-436f-b6bf-c46f1cd81a57-kubelet-dir\") pod \"29994a3c-c9d4-436f-b6bf-c46f1cd81a57\" (UID: \"29994a3c-c9d4-436f-b6bf-c46f1cd81a57\") " Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.282525 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29994a3c-c9d4-436f-b6bf-c46f1cd81a57-kube-api-access\") pod \"29994a3c-c9d4-436f-b6bf-c46f1cd81a57\" (UID: \"29994a3c-c9d4-436f-b6bf-c46f1cd81a57\") " Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.284075 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29994a3c-c9d4-436f-b6bf-c46f1cd81a57-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "29994a3c-c9d4-436f-b6bf-c46f1cd81a57" (UID: "29994a3c-c9d4-436f-b6bf-c46f1cd81a57"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.290883 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29994a3c-c9d4-436f-b6bf-c46f1cd81a57-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "29994a3c-c9d4-436f-b6bf-c46f1cd81a57" (UID: "29994a3c-c9d4-436f-b6bf-c46f1cd81a57"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.366821 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28" Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.390387 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ae8c092c-ec9d-456a-9ba3-5501c22f6280-secret-volume\") pod \"ae8c092c-ec9d-456a-9ba3-5501c22f6280\" (UID: \"ae8c092c-ec9d-456a-9ba3-5501c22f6280\") " Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.390452 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbcwc\" (UniqueName: \"kubernetes.io/projected/ae8c092c-ec9d-456a-9ba3-5501c22f6280-kube-api-access-gbcwc\") pod \"ae8c092c-ec9d-456a-9ba3-5501c22f6280\" (UID: \"ae8c092c-ec9d-456a-9ba3-5501c22f6280\") " Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.390496 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae8c092c-ec9d-456a-9ba3-5501c22f6280-config-volume\") pod \"ae8c092c-ec9d-456a-9ba3-5501c22f6280\" (UID: \"ae8c092c-ec9d-456a-9ba3-5501c22f6280\") " Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.390898 4752 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29994a3c-c9d4-436f-b6bf-c46f1cd81a57-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.390913 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29994a3c-c9d4-436f-b6bf-c46f1cd81a57-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.392210 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae8c092c-ec9d-456a-9ba3-5501c22f6280-config-volume" (OuterVolumeSpecName: "config-volume") pod "ae8c092c-ec9d-456a-9ba3-5501c22f6280" 
(UID: "ae8c092c-ec9d-456a-9ba3-5501c22f6280"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.396225 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae8c092c-ec9d-456a-9ba3-5501c22f6280-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ae8c092c-ec9d-456a-9ba3-5501c22f6280" (UID: "ae8c092c-ec9d-456a-9ba3-5501c22f6280"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.398812 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae8c092c-ec9d-456a-9ba3-5501c22f6280-kube-api-access-gbcwc" (OuterVolumeSpecName: "kube-api-access-gbcwc") pod "ae8c092c-ec9d-456a-9ba3-5501c22f6280" (UID: "ae8c092c-ec9d-456a-9ba3-5501c22f6280"). InnerVolumeSpecName "kube-api-access-gbcwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.491889 4752 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ae8c092c-ec9d-456a-9ba3-5501c22f6280-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.491940 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbcwc\" (UniqueName: \"kubernetes.io/projected/ae8c092c-ec9d-456a-9ba3-5501c22f6280-kube-api-access-gbcwc\") on node \"crc\" DevicePath \"\"" Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.491954 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae8c092c-ec9d-456a-9ba3-5501c22f6280-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.950933 4752 generic.go:334] "Generic (PLEG): container finished" 
podID="5a042fcf-8877-4ea2-97fc-272870ff20d3" containerID="f80becd77050dfa57b8f99dda87853ea911c427d298844842097d02780ecbf0a" exitCode=0 Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.951053 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppd9s" event={"ID":"5a042fcf-8877-4ea2-97fc-272870ff20d3","Type":"ContainerDied","Data":"f80becd77050dfa57b8f99dda87853ea911c427d298844842097d02780ecbf0a"} Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.951091 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppd9s" event={"ID":"5a042fcf-8877-4ea2-97fc-272870ff20d3","Type":"ContainerStarted","Data":"bee4668fdb54047db1f84b3365ff28db1d44a541dfcda5d1de7b2df84c57a84f"} Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.956490 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"29994a3c-c9d4-436f-b6bf-c46f1cd81a57","Type":"ContainerDied","Data":"60594a5e57536659b9887ce59b2e8c8acee2f72c0f67d01dc823ca549dc99b8c"} Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.956546 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.956553 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60594a5e57536659b9887ce59b2e8c8acee2f72c0f67d01dc823ca549dc99b8c" Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.963046 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28" event={"ID":"ae8c092c-ec9d-456a-9ba3-5501c22f6280","Type":"ContainerDied","Data":"c8ad9f8b2cc8e085392060faf803526b988cba516e68587576fcdc4613320ce8"} Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.963096 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8ad9f8b2cc8e085392060faf803526b988cba516e68587576fcdc4613320ce8" Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.963302 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28" Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.977969 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fb4bcdba-8a53-409c-9c6d-a8d464321183","Type":"ContainerDied","Data":"79959936c3876159808610096cdd24b09077b6a6c13a50a26afa76041f6cb178"} Sep 29 10:46:55 crc kubenswrapper[4752]: I0929 10:46:55.978239 4752 generic.go:334] "Generic (PLEG): container finished" podID="fb4bcdba-8a53-409c-9c6d-a8d464321183" containerID="79959936c3876159808610096cdd24b09077b6a6c13a50a26afa76041f6cb178" exitCode=0 Sep 29 10:46:56 crc kubenswrapper[4752]: I0929 10:46:56.175833 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Sep 29 10:46:56 crc kubenswrapper[4752]: I0929 10:46:56.175931 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:46:57 crc kubenswrapper[4752]: I0929 10:46:57.402833 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 10:46:57 crc kubenswrapper[4752]: I0929 10:46:57.468735 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb4bcdba-8a53-409c-9c6d-a8d464321183-kube-api-access\") pod \"fb4bcdba-8a53-409c-9c6d-a8d464321183\" (UID: \"fb4bcdba-8a53-409c-9c6d-a8d464321183\") " Sep 29 10:46:57 crc kubenswrapper[4752]: I0929 10:46:57.468783 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fb4bcdba-8a53-409c-9c6d-a8d464321183-kubelet-dir\") pod \"fb4bcdba-8a53-409c-9c6d-a8d464321183\" (UID: \"fb4bcdba-8a53-409c-9c6d-a8d464321183\") " Sep 29 10:46:57 crc kubenswrapper[4752]: I0929 10:46:57.469255 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb4bcdba-8a53-409c-9c6d-a8d464321183-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fb4bcdba-8a53-409c-9c6d-a8d464321183" (UID: "fb4bcdba-8a53-409c-9c6d-a8d464321183"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:46:57 crc kubenswrapper[4752]: I0929 10:46:57.483059 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb4bcdba-8a53-409c-9c6d-a8d464321183-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fb4bcdba-8a53-409c-9c6d-a8d464321183" (UID: "fb4bcdba-8a53-409c-9c6d-a8d464321183"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:46:57 crc kubenswrapper[4752]: I0929 10:46:57.570846 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb4bcdba-8a53-409c-9c6d-a8d464321183-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 10:46:57 crc kubenswrapper[4752]: I0929 10:46:57.570902 4752 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fb4bcdba-8a53-409c-9c6d-a8d464321183-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 29 10:46:57 crc kubenswrapper[4752]: I0929 10:46:57.784707 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jvk96" Sep 29 10:46:58 crc kubenswrapper[4752]: I0929 10:46:58.018079 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fb4bcdba-8a53-409c-9c6d-a8d464321183","Type":"ContainerDied","Data":"5698bfb62a248342249b2202dab1c360b45594c4a82b7020be458a7389d70dd0"} Sep 29 10:46:58 crc kubenswrapper[4752]: I0929 10:46:58.018123 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5698bfb62a248342249b2202dab1c360b45594c4a82b7020be458a7389d70dd0" Sep 29 10:46:58 crc kubenswrapper[4752]: I0929 10:46:58.018198 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 10:47:02 crc kubenswrapper[4752]: I0929 10:47:02.585056 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:47:02 crc kubenswrapper[4752]: I0929 10:47:02.593512 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:47:02 crc kubenswrapper[4752]: I0929 10:47:02.823522 4752 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhw29 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Sep 29 10:47:02 crc kubenswrapper[4752]: I0929 10:47:02.823609 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-bhw29" podUID="44ae2b29-ec3a-4321-8590-4d316d810034" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Sep 29 10:47:02 crc kubenswrapper[4752]: I0929 10:47:02.823666 4752 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhw29 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Sep 29 10:47:02 crc kubenswrapper[4752]: I0929 10:47:02.823755 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bhw29" podUID="44ae2b29-ec3a-4321-8590-4d316d810034" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Sep 29 10:47:03 crc kubenswrapper[4752]: I0929 10:47:03.778484 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs\") pod \"network-metrics-daemon-sq7f4\" (UID: \"0a33b92e-d79c-4162-8500-df7a89df8df3\") " pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:47:03 crc kubenswrapper[4752]: I0929 10:47:03.786904 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a33b92e-d79c-4162-8500-df7a89df8df3-metrics-certs\") pod \"network-metrics-daemon-sq7f4\" (UID: \"0a33b92e-d79c-4162-8500-df7a89df8df3\") " pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:47:03 crc kubenswrapper[4752]: I0929 10:47:03.945176 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq7f4" Sep 29 10:47:11 crc kubenswrapper[4752]: I0929 10:47:11.316793 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:47:12 crc kubenswrapper[4752]: I0929 10:47:12.848202 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-bhw29" Sep 29 10:47:23 crc kubenswrapper[4752]: I0929 10:47:23.014152 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8vwzv" Sep 29 10:47:23 crc kubenswrapper[4752]: E0929 10:47:23.943282 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Sep 29 10:47:23 crc kubenswrapper[4752]: E0929 10:47:23.943440 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d7mvp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-ppd9s_openshift-marketplace(5a042fcf-8877-4ea2-97fc-272870ff20d3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 29 10:47:23 crc kubenswrapper[4752]: E0929 10:47:23.945148 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-ppd9s" podUID="5a042fcf-8877-4ea2-97fc-272870ff20d3" Sep 29 10:47:24 crc kubenswrapper[4752]: E0929 10:47:24.693050 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-ppd9s" podUID="5a042fcf-8877-4ea2-97fc-272870ff20d3" Sep 29 10:47:24 crc kubenswrapper[4752]: E0929 10:47:24.780698 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Sep 29 10:47:24 crc kubenswrapper[4752]: E0929 10:47:24.780899 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ls7tg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-ks2mq_openshift-marketplace(7beaf483-1002-4e94-a9ee-59e20e83f824): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 29 10:47:24 crc kubenswrapper[4752]: E0929 10:47:24.782776 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-ks2mq" podUID="7beaf483-1002-4e94-a9ee-59e20e83f824" Sep 29 10:47:25 crc 
kubenswrapper[4752]: E0929 10:47:25.998101 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-ks2mq" podUID="7beaf483-1002-4e94-a9ee-59e20e83f824" Sep 29 10:47:26 crc kubenswrapper[4752]: E0929 10:47:26.064558 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Sep 29 10:47:26 crc kubenswrapper[4752]: E0929 10:47:26.064758 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5fb64,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-l2pww_openshift-marketplace(501f967f-86b7-41e1-b650-0b122766a576): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 29 10:47:26 crc kubenswrapper[4752]: E0929 10:47:26.065994 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-l2pww" podUID="501f967f-86b7-41e1-b650-0b122766a576" Sep 29 10:47:26 crc 
kubenswrapper[4752]: I0929 10:47:26.176384 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:47:26 crc kubenswrapper[4752]: I0929 10:47:26.176477 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:47:26 crc kubenswrapper[4752]: E0929 10:47:26.176742 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Sep 29 10:47:26 crc kubenswrapper[4752]: E0929 10:47:26.176923 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-csmtp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-wrp7s_openshift-marketplace(8142731c-cdef-4d76-aeae-bec64f2cb840): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 29 10:47:26 crc kubenswrapper[4752]: E0929 10:47:26.178277 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-wrp7s" podUID="8142731c-cdef-4d76-aeae-bec64f2cb840" Sep 29 10:47:26 crc 
kubenswrapper[4752]: E0929 10:47:26.218651 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Sep 29 10:47:26 crc kubenswrapper[4752]: E0929 10:47:26.219113 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qrxjl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-5tlth_openshift-marketplace(06f9e526-21c6-4e20-b1a8-8f4fbfaa6413): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 29 10:47:26 crc kubenswrapper[4752]: E0929 10:47:26.220416 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5tlth" podUID="06f9e526-21c6-4e20-b1a8-8f4fbfaa6413" Sep 29 10:47:26 crc kubenswrapper[4752]: E0929 10:47:26.265616 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-l2pww" podUID="501f967f-86b7-41e1-b650-0b122766a576" Sep 29 10:47:26 crc kubenswrapper[4752]: E0929 10:47:26.266497 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5tlth" podUID="06f9e526-21c6-4e20-b1a8-8f4fbfaa6413" Sep 29 10:47:26 crc kubenswrapper[4752]: E0929 10:47:26.270250 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-wrp7s" podUID="8142731c-cdef-4d76-aeae-bec64f2cb840" Sep 29 10:47:26 crc kubenswrapper[4752]: I0929 10:47:26.560452 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sq7f4"] Sep 29 
10:47:26 crc kubenswrapper[4752]: W0929 10:47:26.574984 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a33b92e_d79c_4162_8500_df7a89df8df3.slice/crio-244563b0e7fd9cb8a969ea7d120f3573a18d7384698b3ecb81f14a82740b11d4 WatchSource:0}: Error finding container 244563b0e7fd9cb8a969ea7d120f3573a18d7384698b3ecb81f14a82740b11d4: Status 404 returned error can't find the container with id 244563b0e7fd9cb8a969ea7d120f3573a18d7384698b3ecb81f14a82740b11d4 Sep 29 10:47:27 crc kubenswrapper[4752]: I0929 10:47:27.268546 4752 generic.go:334] "Generic (PLEG): container finished" podID="98952fd9-6515-4ba1-8d1a-490a2c3e33b1" containerID="4aa2a3edcf145318abf8b810b931eccb97bec79f868ba747f02873a8fce23af3" exitCode=0 Sep 29 10:47:27 crc kubenswrapper[4752]: I0929 10:47:27.269010 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxshj" event={"ID":"98952fd9-6515-4ba1-8d1a-490a2c3e33b1","Type":"ContainerDied","Data":"4aa2a3edcf145318abf8b810b931eccb97bec79f868ba747f02873a8fce23af3"} Sep 29 10:47:27 crc kubenswrapper[4752]: I0929 10:47:27.278062 4752 generic.go:334] "Generic (PLEG): container finished" podID="83d184f7-5dec-4c4c-b53e-d26af311916c" containerID="b1db737947e02f325ac9276d4371efa75931ab79d6707833eb196e806c8be50f" exitCode=0 Sep 29 10:47:27 crc kubenswrapper[4752]: I0929 10:47:27.278141 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pwfh" event={"ID":"83d184f7-5dec-4c4c-b53e-d26af311916c","Type":"ContainerDied","Data":"b1db737947e02f325ac9276d4371efa75931ab79d6707833eb196e806c8be50f"} Sep 29 10:47:27 crc kubenswrapper[4752]: I0929 10:47:27.283906 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sq7f4" 
event={"ID":"0a33b92e-d79c-4162-8500-df7a89df8df3","Type":"ContainerStarted","Data":"335c4b03d85e835c84e394ae4a124c79d20562e3a3d630ef8cf82b8860485096"} Sep 29 10:47:27 crc kubenswrapper[4752]: I0929 10:47:27.283966 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sq7f4" event={"ID":"0a33b92e-d79c-4162-8500-df7a89df8df3","Type":"ContainerStarted","Data":"38d8b627bd3c7ed415d512fcf02d4d32f1569c92985cee0c43a9befbb24b4b1b"} Sep 29 10:47:27 crc kubenswrapper[4752]: I0929 10:47:27.283979 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sq7f4" event={"ID":"0a33b92e-d79c-4162-8500-df7a89df8df3","Type":"ContainerStarted","Data":"244563b0e7fd9cb8a969ea7d120f3573a18d7384698b3ecb81f14a82740b11d4"} Sep 29 10:47:27 crc kubenswrapper[4752]: I0929 10:47:27.293376 4752 generic.go:334] "Generic (PLEG): container finished" podID="37b4288c-7657-4647-b24e-98547f53c24f" containerID="5ad159fb7fba44e7c26909a5fdbf93149783a751e1c5932eeb89f9b5d231a692" exitCode=0 Sep 29 10:47:27 crc kubenswrapper[4752]: I0929 10:47:27.293479 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mskwx" event={"ID":"37b4288c-7657-4647-b24e-98547f53c24f","Type":"ContainerDied","Data":"5ad159fb7fba44e7c26909a5fdbf93149783a751e1c5932eeb89f9b5d231a692"} Sep 29 10:47:27 crc kubenswrapper[4752]: I0929 10:47:27.330493 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-sq7f4" podStartSLOduration=167.330473907 podStartE2EDuration="2m47.330473907s" podCreationTimestamp="2025-09-29 10:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:47:27.327230487 +0000 UTC m=+188.116372154" watchObservedRunningTime="2025-09-29 10:47:27.330473907 +0000 UTC m=+188.119615574" Sep 29 10:47:28 crc kubenswrapper[4752]: I0929 
10:47:28.321950 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mskwx" event={"ID":"37b4288c-7657-4647-b24e-98547f53c24f","Type":"ContainerStarted","Data":"5471d960b6b1b20f08bb0055d010509cf15aa20e711ce7adc2d5c007e50827ab"} Sep 29 10:47:28 crc kubenswrapper[4752]: I0929 10:47:28.325097 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxshj" event={"ID":"98952fd9-6515-4ba1-8d1a-490a2c3e33b1","Type":"ContainerStarted","Data":"f912b42adf2424dd2eef9173682ffea98049dbd253b645455af2dc3509539ee4"} Sep 29 10:47:28 crc kubenswrapper[4752]: I0929 10:47:28.327790 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pwfh" event={"ID":"83d184f7-5dec-4c4c-b53e-d26af311916c","Type":"ContainerStarted","Data":"d0c5a7b9b00fc04b5d276284d9ede8949358bd36e57d305311bf53526fb9ae7b"} Sep 29 10:47:28 crc kubenswrapper[4752]: I0929 10:47:28.344145 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mskwx" podStartSLOduration=2.39176562 podStartE2EDuration="37.344117493s" podCreationTimestamp="2025-09-29 10:46:51 +0000 UTC" firstStartedPulling="2025-09-29 10:46:52.870733719 +0000 UTC m=+153.659875386" lastFinishedPulling="2025-09-29 10:47:27.823085592 +0000 UTC m=+188.612227259" observedRunningTime="2025-09-29 10:47:28.343918246 +0000 UTC m=+189.133059933" watchObservedRunningTime="2025-09-29 10:47:28.344117493 +0000 UTC m=+189.133259180" Sep 29 10:47:28 crc kubenswrapper[4752]: I0929 10:47:28.364641 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kxshj" podStartSLOduration=2.4886873019999998 podStartE2EDuration="35.364617168s" podCreationTimestamp="2025-09-29 10:46:53 +0000 UTC" firstStartedPulling="2025-09-29 10:46:54.923251267 +0000 UTC m=+155.712392934" lastFinishedPulling="2025-09-29 10:47:27.799181123 
+0000 UTC m=+188.588322800" observedRunningTime="2025-09-29 10:47:28.362114729 +0000 UTC m=+189.151256386" watchObservedRunningTime="2025-09-29 10:47:28.364617168 +0000 UTC m=+189.153758835" Sep 29 10:47:28 crc kubenswrapper[4752]: I0929 10:47:28.391327 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2pwfh" podStartSLOduration=3.510549138 podStartE2EDuration="38.391288194s" podCreationTimestamp="2025-09-29 10:46:50 +0000 UTC" firstStartedPulling="2025-09-29 10:46:52.854672507 +0000 UTC m=+153.643814174" lastFinishedPulling="2025-09-29 10:47:27.735411563 +0000 UTC m=+188.524553230" observedRunningTime="2025-09-29 10:47:28.391110449 +0000 UTC m=+189.180252126" watchObservedRunningTime="2025-09-29 10:47:28.391288194 +0000 UTC m=+189.180429861" Sep 29 10:47:29 crc kubenswrapper[4752]: I0929 10:47:29.071632 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 10:47:31 crc kubenswrapper[4752]: I0929 10:47:31.069492 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2pwfh" Sep 29 10:47:31 crc kubenswrapper[4752]: I0929 10:47:31.070120 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2pwfh" Sep 29 10:47:31 crc kubenswrapper[4752]: I0929 10:47:31.248385 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2pwfh" Sep 29 10:47:31 crc kubenswrapper[4752]: I0929 10:47:31.473297 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mskwx" Sep 29 10:47:31 crc kubenswrapper[4752]: I0929 10:47:31.473352 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mskwx" Sep 29 10:47:31 crc 
kubenswrapper[4752]: I0929 10:47:31.532654 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mskwx" Sep 29 10:47:34 crc kubenswrapper[4752]: I0929 10:47:34.298088 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kxshj" Sep 29 10:47:34 crc kubenswrapper[4752]: I0929 10:47:34.301780 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kxshj" Sep 29 10:47:34 crc kubenswrapper[4752]: I0929 10:47:34.346485 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kxshj" Sep 29 10:47:34 crc kubenswrapper[4752]: I0929 10:47:34.419737 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kxshj" Sep 29 10:47:41 crc kubenswrapper[4752]: I0929 10:47:41.116965 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2pwfh" Sep 29 10:47:41 crc kubenswrapper[4752]: I0929 10:47:41.525684 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mskwx" Sep 29 10:47:41 crc kubenswrapper[4752]: I0929 10:47:41.581895 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mskwx"] Sep 29 10:47:42 crc kubenswrapper[4752]: I0929 10:47:42.413854 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mskwx" podUID="37b4288c-7657-4647-b24e-98547f53c24f" containerName="registry-server" containerID="cri-o://5471d960b6b1b20f08bb0055d010509cf15aa20e711ce7adc2d5c007e50827ab" gracePeriod=2 Sep 29 10:47:43 crc kubenswrapper[4752]: I0929 10:47:43.420923 4752 generic.go:334] "Generic (PLEG): container finished" 
podID="37b4288c-7657-4647-b24e-98547f53c24f" containerID="5471d960b6b1b20f08bb0055d010509cf15aa20e711ce7adc2d5c007e50827ab" exitCode=0 Sep 29 10:47:43 crc kubenswrapper[4752]: I0929 10:47:43.420988 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mskwx" event={"ID":"37b4288c-7657-4647-b24e-98547f53c24f","Type":"ContainerDied","Data":"5471d960b6b1b20f08bb0055d010509cf15aa20e711ce7adc2d5c007e50827ab"} Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.144247 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mskwx" Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.232376 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37b4288c-7657-4647-b24e-98547f53c24f-catalog-content\") pod \"37b4288c-7657-4647-b24e-98547f53c24f\" (UID: \"37b4288c-7657-4647-b24e-98547f53c24f\") " Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.232480 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37b4288c-7657-4647-b24e-98547f53c24f-utilities\") pod \"37b4288c-7657-4647-b24e-98547f53c24f\" (UID: \"37b4288c-7657-4647-b24e-98547f53c24f\") " Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.232583 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s7gw\" (UniqueName: \"kubernetes.io/projected/37b4288c-7657-4647-b24e-98547f53c24f-kube-api-access-2s7gw\") pod \"37b4288c-7657-4647-b24e-98547f53c24f\" (UID: \"37b4288c-7657-4647-b24e-98547f53c24f\") " Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.233643 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37b4288c-7657-4647-b24e-98547f53c24f-utilities" (OuterVolumeSpecName: "utilities") pod 
"37b4288c-7657-4647-b24e-98547f53c24f" (UID: "37b4288c-7657-4647-b24e-98547f53c24f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.238066 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b4288c-7657-4647-b24e-98547f53c24f-kube-api-access-2s7gw" (OuterVolumeSpecName: "kube-api-access-2s7gw") pod "37b4288c-7657-4647-b24e-98547f53c24f" (UID: "37b4288c-7657-4647-b24e-98547f53c24f"). InnerVolumeSpecName "kube-api-access-2s7gw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.282319 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37b4288c-7657-4647-b24e-98547f53c24f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37b4288c-7657-4647-b24e-98547f53c24f" (UID: "37b4288c-7657-4647-b24e-98547f53c24f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.334123 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37b4288c-7657-4647-b24e-98547f53c24f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.334354 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37b4288c-7657-4647-b24e-98547f53c24f-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.334454 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s7gw\" (UniqueName: \"kubernetes.io/projected/37b4288c-7657-4647-b24e-98547f53c24f-kube-api-access-2s7gw\") on node \"crc\" DevicePath \"\"" Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.440574 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppd9s" event={"ID":"5a042fcf-8877-4ea2-97fc-272870ff20d3","Type":"ContainerStarted","Data":"4bf264f5fc466656b07009ce5780636f2b1ec1df46741d49026f91e2cb1df4d5"} Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.442504 4752 generic.go:334] "Generic (PLEG): container finished" podID="7beaf483-1002-4e94-a9ee-59e20e83f824" containerID="74a6ebf8ba535654ecb7d4e6402ee5f2f831fe1ffd7af9a21b39838d4080455c" exitCode=0 Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.442565 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ks2mq" event={"ID":"7beaf483-1002-4e94-a9ee-59e20e83f824","Type":"ContainerDied","Data":"74a6ebf8ba535654ecb7d4e6402ee5f2f831fe1ffd7af9a21b39838d4080455c"} Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.451638 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tlth" 
event={"ID":"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413","Type":"ContainerStarted","Data":"10e0712e6f14e2255ad914c318ba42f3a3566ba33c400cb8b48a2361a4bd0698"} Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.454933 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mskwx" Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.454936 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mskwx" event={"ID":"37b4288c-7657-4647-b24e-98547f53c24f","Type":"ContainerDied","Data":"2cd9fc85d5a94b135a60cb791ba2a80976dabf99f0cc1d164d5bad08d0259c23"} Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.455187 4752 scope.go:117] "RemoveContainer" containerID="5471d960b6b1b20f08bb0055d010509cf15aa20e711ce7adc2d5c007e50827ab" Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.467299 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2pww" event={"ID":"501f967f-86b7-41e1-b650-0b122766a576","Type":"ContainerStarted","Data":"c490bba0c471aec9bea737e85365d9cc8c2d4cb402f2c6ae8f1eee17ee712449"} Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.473897 4752 scope.go:117] "RemoveContainer" containerID="5ad159fb7fba44e7c26909a5fdbf93149783a751e1c5932eeb89f9b5d231a692" Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.504542 4752 scope.go:117] "RemoveContainer" containerID="b6308f197c883da696565e7608858a9aff07a91afbb7fac78aaaba9db165ec3e" Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.526468 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mskwx"] Sep 29 10:47:46 crc kubenswrapper[4752]: I0929 10:47:46.530093 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mskwx"] Sep 29 10:47:47 crc kubenswrapper[4752]: I0929 10:47:47.475676 4752 generic.go:334] "Generic (PLEG): container finished" 
podID="8142731c-cdef-4d76-aeae-bec64f2cb840" containerID="d9344f88cf2987540b46ea6b94cecf36332b1fa331a867f99fa97fb5bb8a72d0" exitCode=0 Sep 29 10:47:47 crc kubenswrapper[4752]: I0929 10:47:47.475906 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrp7s" event={"ID":"8142731c-cdef-4d76-aeae-bec64f2cb840","Type":"ContainerDied","Data":"d9344f88cf2987540b46ea6b94cecf36332b1fa331a867f99fa97fb5bb8a72d0"} Sep 29 10:47:47 crc kubenswrapper[4752]: I0929 10:47:47.478992 4752 generic.go:334] "Generic (PLEG): container finished" podID="5a042fcf-8877-4ea2-97fc-272870ff20d3" containerID="4bf264f5fc466656b07009ce5780636f2b1ec1df46741d49026f91e2cb1df4d5" exitCode=0 Sep 29 10:47:47 crc kubenswrapper[4752]: I0929 10:47:47.479054 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppd9s" event={"ID":"5a042fcf-8877-4ea2-97fc-272870ff20d3","Type":"ContainerDied","Data":"4bf264f5fc466656b07009ce5780636f2b1ec1df46741d49026f91e2cb1df4d5"} Sep 29 10:47:47 crc kubenswrapper[4752]: I0929 10:47:47.482189 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ks2mq" event={"ID":"7beaf483-1002-4e94-a9ee-59e20e83f824","Type":"ContainerStarted","Data":"36a7228ab347fdea68adf730b6d6291cc97192132f16007cec3fadbccdb335ce"} Sep 29 10:47:47 crc kubenswrapper[4752]: I0929 10:47:47.485910 4752 generic.go:334] "Generic (PLEG): container finished" podID="06f9e526-21c6-4e20-b1a8-8f4fbfaa6413" containerID="10e0712e6f14e2255ad914c318ba42f3a3566ba33c400cb8b48a2361a4bd0698" exitCode=0 Sep 29 10:47:47 crc kubenswrapper[4752]: I0929 10:47:47.485982 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tlth" event={"ID":"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413","Type":"ContainerDied","Data":"10e0712e6f14e2255ad914c318ba42f3a3566ba33c400cb8b48a2361a4bd0698"} Sep 29 10:47:47 crc kubenswrapper[4752]: I0929 10:47:47.492421 4752 
generic.go:334] "Generic (PLEG): container finished" podID="501f967f-86b7-41e1-b650-0b122766a576" containerID="c490bba0c471aec9bea737e85365d9cc8c2d4cb402f2c6ae8f1eee17ee712449" exitCode=0 Sep 29 10:47:47 crc kubenswrapper[4752]: I0929 10:47:47.492470 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2pww" event={"ID":"501f967f-86b7-41e1-b650-0b122766a576","Type":"ContainerDied","Data":"c490bba0c471aec9bea737e85365d9cc8c2d4cb402f2c6ae8f1eee17ee712449"} Sep 29 10:47:47 crc kubenswrapper[4752]: I0929 10:47:47.517585 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ks2mq" podStartSLOduration=2.525460971 podStartE2EDuration="55.517566013s" podCreationTimestamp="2025-09-29 10:46:52 +0000 UTC" firstStartedPulling="2025-09-29 10:46:53.884004325 +0000 UTC m=+154.673145992" lastFinishedPulling="2025-09-29 10:47:46.876109367 +0000 UTC m=+207.665251034" observedRunningTime="2025-09-29 10:47:47.515584838 +0000 UTC m=+208.304726515" watchObservedRunningTime="2025-09-29 10:47:47.517566013 +0000 UTC m=+208.306707680" Sep 29 10:47:48 crc kubenswrapper[4752]: I0929 10:47:48.040312 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37b4288c-7657-4647-b24e-98547f53c24f" path="/var/lib/kubelet/pods/37b4288c-7657-4647-b24e-98547f53c24f/volumes" Sep 29 10:47:48 crc kubenswrapper[4752]: I0929 10:47:48.500213 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrp7s" event={"ID":"8142731c-cdef-4d76-aeae-bec64f2cb840","Type":"ContainerStarted","Data":"fa511215d6202dbe033daf5bbb9ce526ca13c30429c94a3eff640e342ebad66a"} Sep 29 10:47:48 crc kubenswrapper[4752]: I0929 10:47:48.502908 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppd9s" 
event={"ID":"5a042fcf-8877-4ea2-97fc-272870ff20d3","Type":"ContainerStarted","Data":"e4150cf8eabe1dd046d90f032e286e41b7b454df869caa4cff43aa42f35294d3"} Sep 29 10:47:48 crc kubenswrapper[4752]: I0929 10:47:48.506458 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tlth" event={"ID":"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413","Type":"ContainerStarted","Data":"d988ee7d19339e1e3336f3096d0ad29d0050f425aaee5ca8edb17f9c87bfef86"} Sep 29 10:47:48 crc kubenswrapper[4752]: I0929 10:47:48.509299 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2pww" event={"ID":"501f967f-86b7-41e1-b650-0b122766a576","Type":"ContainerStarted","Data":"df27cb1caef62898401c63c9994bd99c9de931517c3bd10ecdc3388d861e7652"} Sep 29 10:47:48 crc kubenswrapper[4752]: I0929 10:47:48.527776 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wrp7s" podStartSLOduration=2.271788 podStartE2EDuration="56.527754941s" podCreationTimestamp="2025-09-29 10:46:52 +0000 UTC" firstStartedPulling="2025-09-29 10:46:53.906965178 +0000 UTC m=+154.696106835" lastFinishedPulling="2025-09-29 10:47:48.162932109 +0000 UTC m=+208.952073776" observedRunningTime="2025-09-29 10:47:48.524508581 +0000 UTC m=+209.313650248" watchObservedRunningTime="2025-09-29 10:47:48.527754941 +0000 UTC m=+209.316896608" Sep 29 10:47:48 crc kubenswrapper[4752]: I0929 10:47:48.545413 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ppd9s" podStartSLOduration=2.347879627 podStartE2EDuration="54.545387374s" podCreationTimestamp="2025-09-29 10:46:54 +0000 UTC" firstStartedPulling="2025-09-29 10:46:55.957894892 +0000 UTC m=+156.747036559" lastFinishedPulling="2025-09-29 10:47:48.155402639 +0000 UTC m=+208.944544306" observedRunningTime="2025-09-29 10:47:48.542645688 +0000 UTC m=+209.331787355" 
watchObservedRunningTime="2025-09-29 10:47:48.545387374 +0000 UTC m=+209.334529041" Sep 29 10:47:48 crc kubenswrapper[4752]: I0929 10:47:48.566892 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5tlth" podStartSLOduration=2.261947351 podStartE2EDuration="58.566857654s" podCreationTimestamp="2025-09-29 10:46:50 +0000 UTC" firstStartedPulling="2025-09-29 10:46:51.758707859 +0000 UTC m=+152.547849526" lastFinishedPulling="2025-09-29 10:47:48.063618162 +0000 UTC m=+208.852759829" observedRunningTime="2025-09-29 10:47:48.564000355 +0000 UTC m=+209.353142012" watchObservedRunningTime="2025-09-29 10:47:48.566857654 +0000 UTC m=+209.355999331" Sep 29 10:47:48 crc kubenswrapper[4752]: I0929 10:47:48.586231 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l2pww" podStartSLOduration=3.098599928 podStartE2EDuration="58.586206316s" podCreationTimestamp="2025-09-29 10:46:50 +0000 UTC" firstStartedPulling="2025-09-29 10:46:52.852737963 +0000 UTC m=+153.641879630" lastFinishedPulling="2025-09-29 10:47:48.34034435 +0000 UTC m=+209.129486018" observedRunningTime="2025-09-29 10:47:48.58137643 +0000 UTC m=+209.370518097" watchObservedRunningTime="2025-09-29 10:47:48.586206316 +0000 UTC m=+209.375347983" Sep 29 10:47:50 crc kubenswrapper[4752]: I0929 10:47:50.867036 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5tlth" Sep 29 10:47:50 crc kubenswrapper[4752]: I0929 10:47:50.867711 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5tlth" Sep 29 10:47:50 crc kubenswrapper[4752]: I0929 10:47:50.929388 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5tlth" Sep 29 10:47:51 crc kubenswrapper[4752]: I0929 10:47:51.254059 4752 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l2pww" Sep 29 10:47:51 crc kubenswrapper[4752]: I0929 10:47:51.255707 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l2pww" Sep 29 10:47:51 crc kubenswrapper[4752]: I0929 10:47:51.294201 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l2pww" Sep 29 10:47:51 crc kubenswrapper[4752]: I0929 10:47:51.702696 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-prpgr"] Sep 29 10:47:52 crc kubenswrapper[4752]: I0929 10:47:52.858853 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ks2mq" Sep 29 10:47:52 crc kubenswrapper[4752]: I0929 10:47:52.859234 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ks2mq" Sep 29 10:47:52 crc kubenswrapper[4752]: I0929 10:47:52.906445 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ks2mq" Sep 29 10:47:53 crc kubenswrapper[4752]: I0929 10:47:53.266741 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wrp7s" Sep 29 10:47:53 crc kubenswrapper[4752]: I0929 10:47:53.266833 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wrp7s" Sep 29 10:47:53 crc kubenswrapper[4752]: I0929 10:47:53.307612 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wrp7s" Sep 29 10:47:53 crc kubenswrapper[4752]: I0929 10:47:53.574406 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-wrp7s" Sep 29 10:47:53 crc kubenswrapper[4752]: I0929 10:47:53.575982 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ks2mq" Sep 29 10:47:54 crc kubenswrapper[4752]: I0929 10:47:54.696327 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ppd9s" Sep 29 10:47:54 crc kubenswrapper[4752]: I0929 10:47:54.696380 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ppd9s" Sep 29 10:47:54 crc kubenswrapper[4752]: I0929 10:47:54.738045 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ppd9s" Sep 29 10:47:55 crc kubenswrapper[4752]: I0929 10:47:55.582355 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ppd9s" Sep 29 10:47:55 crc kubenswrapper[4752]: I0929 10:47:55.670920 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrp7s"] Sep 29 10:47:55 crc kubenswrapper[4752]: I0929 10:47:55.671150 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wrp7s" podUID="8142731c-cdef-4d76-aeae-bec64f2cb840" containerName="registry-server" containerID="cri-o://fa511215d6202dbe033daf5bbb9ce526ca13c30429c94a3eff640e342ebad66a" gracePeriod=2 Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.069113 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wrp7s" Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.175766 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.175841 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.175894 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.176540 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67"} pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.176638 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" containerID="cri-o://32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67" gracePeriod=600 Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.273556 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8142731c-cdef-4d76-aeae-bec64f2cb840-utilities\") pod \"8142731c-cdef-4d76-aeae-bec64f2cb840\" (UID: \"8142731c-cdef-4d76-aeae-bec64f2cb840\") " Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.274079 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8142731c-cdef-4d76-aeae-bec64f2cb840-catalog-content\") pod \"8142731c-cdef-4d76-aeae-bec64f2cb840\" (UID: \"8142731c-cdef-4d76-aeae-bec64f2cb840\") " Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.274235 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csmtp\" (UniqueName: \"kubernetes.io/projected/8142731c-cdef-4d76-aeae-bec64f2cb840-kube-api-access-csmtp\") pod \"8142731c-cdef-4d76-aeae-bec64f2cb840\" (UID: \"8142731c-cdef-4d76-aeae-bec64f2cb840\") " Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.274770 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8142731c-cdef-4d76-aeae-bec64f2cb840-utilities" (OuterVolumeSpecName: "utilities") pod "8142731c-cdef-4d76-aeae-bec64f2cb840" (UID: "8142731c-cdef-4d76-aeae-bec64f2cb840"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.284591 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8142731c-cdef-4d76-aeae-bec64f2cb840-kube-api-access-csmtp" (OuterVolumeSpecName: "kube-api-access-csmtp") pod "8142731c-cdef-4d76-aeae-bec64f2cb840" (UID: "8142731c-cdef-4d76-aeae-bec64f2cb840"). InnerVolumeSpecName "kube-api-access-csmtp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.287933 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8142731c-cdef-4d76-aeae-bec64f2cb840-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8142731c-cdef-4d76-aeae-bec64f2cb840" (UID: "8142731c-cdef-4d76-aeae-bec64f2cb840"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.375180 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8142731c-cdef-4d76-aeae-bec64f2cb840-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.375216 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csmtp\" (UniqueName: \"kubernetes.io/projected/8142731c-cdef-4d76-aeae-bec64f2cb840-kube-api-access-csmtp\") on node \"crc\" DevicePath \"\"" Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.375228 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8142731c-cdef-4d76-aeae-bec64f2cb840-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.553917 4752 generic.go:334] "Generic (PLEG): container finished" podID="8142731c-cdef-4d76-aeae-bec64f2cb840" containerID="fa511215d6202dbe033daf5bbb9ce526ca13c30429c94a3eff640e342ebad66a" exitCode=0 Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.553973 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrp7s" event={"ID":"8142731c-cdef-4d76-aeae-bec64f2cb840","Type":"ContainerDied","Data":"fa511215d6202dbe033daf5bbb9ce526ca13c30429c94a3eff640e342ebad66a"} Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.554038 4752 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-wrp7s" event={"ID":"8142731c-cdef-4d76-aeae-bec64f2cb840","Type":"ContainerDied","Data":"da86f7af7c4b5a8962282dbeed433104c63a7238e91cc36b34c8798f3b6fc776"} Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.554063 4752 scope.go:117] "RemoveContainer" containerID="fa511215d6202dbe033daf5bbb9ce526ca13c30429c94a3eff640e342ebad66a" Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.554478 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wrp7s" Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.559174 4752 generic.go:334] "Generic (PLEG): container finished" podID="5863c243-797d-462a-b11f-71aaf005f8d1" containerID="32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67" exitCode=0 Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.559244 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerDied","Data":"32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67"} Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.559311 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerStarted","Data":"907813a6b730b09c945dfe34cd11dc9926afdeb1d7a721e02b4bac45108adba9"} Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.580282 4752 scope.go:117] "RemoveContainer" containerID="d9344f88cf2987540b46ea6b94cecf36332b1fa331a867f99fa97fb5bb8a72d0" Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.593387 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrp7s"] Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.597536 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-wrp7s"] Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.603707 4752 scope.go:117] "RemoveContainer" containerID="0ef7a934b8b27f21b65574bab5ce52ffe65e4d45cfc3fcfa2a4efe2ff939d89b" Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.619768 4752 scope.go:117] "RemoveContainer" containerID="fa511215d6202dbe033daf5bbb9ce526ca13c30429c94a3eff640e342ebad66a" Sep 29 10:47:56 crc kubenswrapper[4752]: E0929 10:47:56.620309 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa511215d6202dbe033daf5bbb9ce526ca13c30429c94a3eff640e342ebad66a\": container with ID starting with fa511215d6202dbe033daf5bbb9ce526ca13c30429c94a3eff640e342ebad66a not found: ID does not exist" containerID="fa511215d6202dbe033daf5bbb9ce526ca13c30429c94a3eff640e342ebad66a" Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.620361 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa511215d6202dbe033daf5bbb9ce526ca13c30429c94a3eff640e342ebad66a"} err="failed to get container status \"fa511215d6202dbe033daf5bbb9ce526ca13c30429c94a3eff640e342ebad66a\": rpc error: code = NotFound desc = could not find container \"fa511215d6202dbe033daf5bbb9ce526ca13c30429c94a3eff640e342ebad66a\": container with ID starting with fa511215d6202dbe033daf5bbb9ce526ca13c30429c94a3eff640e342ebad66a not found: ID does not exist" Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.620394 4752 scope.go:117] "RemoveContainer" containerID="d9344f88cf2987540b46ea6b94cecf36332b1fa331a867f99fa97fb5bb8a72d0" Sep 29 10:47:56 crc kubenswrapper[4752]: E0929 10:47:56.621447 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9344f88cf2987540b46ea6b94cecf36332b1fa331a867f99fa97fb5bb8a72d0\": container with ID starting with 
d9344f88cf2987540b46ea6b94cecf36332b1fa331a867f99fa97fb5bb8a72d0 not found: ID does not exist" containerID="d9344f88cf2987540b46ea6b94cecf36332b1fa331a867f99fa97fb5bb8a72d0" Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.621503 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9344f88cf2987540b46ea6b94cecf36332b1fa331a867f99fa97fb5bb8a72d0"} err="failed to get container status \"d9344f88cf2987540b46ea6b94cecf36332b1fa331a867f99fa97fb5bb8a72d0\": rpc error: code = NotFound desc = could not find container \"d9344f88cf2987540b46ea6b94cecf36332b1fa331a867f99fa97fb5bb8a72d0\": container with ID starting with d9344f88cf2987540b46ea6b94cecf36332b1fa331a867f99fa97fb5bb8a72d0 not found: ID does not exist" Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.621549 4752 scope.go:117] "RemoveContainer" containerID="0ef7a934b8b27f21b65574bab5ce52ffe65e4d45cfc3fcfa2a4efe2ff939d89b" Sep 29 10:47:56 crc kubenswrapper[4752]: E0929 10:47:56.621870 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ef7a934b8b27f21b65574bab5ce52ffe65e4d45cfc3fcfa2a4efe2ff939d89b\": container with ID starting with 0ef7a934b8b27f21b65574bab5ce52ffe65e4d45cfc3fcfa2a4efe2ff939d89b not found: ID does not exist" containerID="0ef7a934b8b27f21b65574bab5ce52ffe65e4d45cfc3fcfa2a4efe2ff939d89b" Sep 29 10:47:56 crc kubenswrapper[4752]: I0929 10:47:56.621900 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ef7a934b8b27f21b65574bab5ce52ffe65e4d45cfc3fcfa2a4efe2ff939d89b"} err="failed to get container status \"0ef7a934b8b27f21b65574bab5ce52ffe65e4d45cfc3fcfa2a4efe2ff939d89b\": rpc error: code = NotFound desc = could not find container \"0ef7a934b8b27f21b65574bab5ce52ffe65e4d45cfc3fcfa2a4efe2ff939d89b\": container with ID starting with 0ef7a934b8b27f21b65574bab5ce52ffe65e4d45cfc3fcfa2a4efe2ff939d89b not found: ID does not 
exist" Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.037931 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8142731c-cdef-4d76-aeae-bec64f2cb840" path="/var/lib/kubelet/pods/8142731c-cdef-4d76-aeae-bec64f2cb840/volumes" Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.071889 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ppd9s"] Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.072116 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ppd9s" podUID="5a042fcf-8877-4ea2-97fc-272870ff20d3" containerName="registry-server" containerID="cri-o://e4150cf8eabe1dd046d90f032e286e41b7b454df869caa4cff43aa42f35294d3" gracePeriod=2 Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.471512 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppd9s" Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.513397 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7mvp\" (UniqueName: \"kubernetes.io/projected/5a042fcf-8877-4ea2-97fc-272870ff20d3-kube-api-access-d7mvp\") pod \"5a042fcf-8877-4ea2-97fc-272870ff20d3\" (UID: \"5a042fcf-8877-4ea2-97fc-272870ff20d3\") " Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.513604 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a042fcf-8877-4ea2-97fc-272870ff20d3-utilities\") pod \"5a042fcf-8877-4ea2-97fc-272870ff20d3\" (UID: \"5a042fcf-8877-4ea2-97fc-272870ff20d3\") " Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.513665 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a042fcf-8877-4ea2-97fc-272870ff20d3-catalog-content\") pod 
\"5a042fcf-8877-4ea2-97fc-272870ff20d3\" (UID: \"5a042fcf-8877-4ea2-97fc-272870ff20d3\") " Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.520983 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a042fcf-8877-4ea2-97fc-272870ff20d3-utilities" (OuterVolumeSpecName: "utilities") pod "5a042fcf-8877-4ea2-97fc-272870ff20d3" (UID: "5a042fcf-8877-4ea2-97fc-272870ff20d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.521816 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a042fcf-8877-4ea2-97fc-272870ff20d3-kube-api-access-d7mvp" (OuterVolumeSpecName: "kube-api-access-d7mvp") pod "5a042fcf-8877-4ea2-97fc-272870ff20d3" (UID: "5a042fcf-8877-4ea2-97fc-272870ff20d3"). InnerVolumeSpecName "kube-api-access-d7mvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.576188 4752 generic.go:334] "Generic (PLEG): container finished" podID="5a042fcf-8877-4ea2-97fc-272870ff20d3" containerID="e4150cf8eabe1dd046d90f032e286e41b7b454df869caa4cff43aa42f35294d3" exitCode=0 Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.576243 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppd9s" event={"ID":"5a042fcf-8877-4ea2-97fc-272870ff20d3","Type":"ContainerDied","Data":"e4150cf8eabe1dd046d90f032e286e41b7b454df869caa4cff43aa42f35294d3"} Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.576276 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppd9s" event={"ID":"5a042fcf-8877-4ea2-97fc-272870ff20d3","Type":"ContainerDied","Data":"bee4668fdb54047db1f84b3365ff28db1d44a541dfcda5d1de7b2df84c57a84f"} Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.576297 4752 scope.go:117] "RemoveContainer" 
containerID="e4150cf8eabe1dd046d90f032e286e41b7b454df869caa4cff43aa42f35294d3" Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.576300 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppd9s" Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.595686 4752 scope.go:117] "RemoveContainer" containerID="4bf264f5fc466656b07009ce5780636f2b1ec1df46741d49026f91e2cb1df4d5" Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.615100 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a042fcf-8877-4ea2-97fc-272870ff20d3-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.615134 4752 scope.go:117] "RemoveContainer" containerID="f80becd77050dfa57b8f99dda87853ea911c427d298844842097d02780ecbf0a" Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.615173 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7mvp\" (UniqueName: \"kubernetes.io/projected/5a042fcf-8877-4ea2-97fc-272870ff20d3-kube-api-access-d7mvp\") on node \"crc\" DevicePath \"\"" Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.618742 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a042fcf-8877-4ea2-97fc-272870ff20d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a042fcf-8877-4ea2-97fc-272870ff20d3" (UID: "5a042fcf-8877-4ea2-97fc-272870ff20d3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.634635 4752 scope.go:117] "RemoveContainer" containerID="e4150cf8eabe1dd046d90f032e286e41b7b454df869caa4cff43aa42f35294d3" Sep 29 10:47:58 crc kubenswrapper[4752]: E0929 10:47:58.635383 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4150cf8eabe1dd046d90f032e286e41b7b454df869caa4cff43aa42f35294d3\": container with ID starting with e4150cf8eabe1dd046d90f032e286e41b7b454df869caa4cff43aa42f35294d3 not found: ID does not exist" containerID="e4150cf8eabe1dd046d90f032e286e41b7b454df869caa4cff43aa42f35294d3" Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.635512 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4150cf8eabe1dd046d90f032e286e41b7b454df869caa4cff43aa42f35294d3"} err="failed to get container status \"e4150cf8eabe1dd046d90f032e286e41b7b454df869caa4cff43aa42f35294d3\": rpc error: code = NotFound desc = could not find container \"e4150cf8eabe1dd046d90f032e286e41b7b454df869caa4cff43aa42f35294d3\": container with ID starting with e4150cf8eabe1dd046d90f032e286e41b7b454df869caa4cff43aa42f35294d3 not found: ID does not exist" Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.635601 4752 scope.go:117] "RemoveContainer" containerID="4bf264f5fc466656b07009ce5780636f2b1ec1df46741d49026f91e2cb1df4d5" Sep 29 10:47:58 crc kubenswrapper[4752]: E0929 10:47:58.636430 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bf264f5fc466656b07009ce5780636f2b1ec1df46741d49026f91e2cb1df4d5\": container with ID starting with 4bf264f5fc466656b07009ce5780636f2b1ec1df46741d49026f91e2cb1df4d5 not found: ID does not exist" containerID="4bf264f5fc466656b07009ce5780636f2b1ec1df46741d49026f91e2cb1df4d5" Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.636479 
4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bf264f5fc466656b07009ce5780636f2b1ec1df46741d49026f91e2cb1df4d5"} err="failed to get container status \"4bf264f5fc466656b07009ce5780636f2b1ec1df46741d49026f91e2cb1df4d5\": rpc error: code = NotFound desc = could not find container \"4bf264f5fc466656b07009ce5780636f2b1ec1df46741d49026f91e2cb1df4d5\": container with ID starting with 4bf264f5fc466656b07009ce5780636f2b1ec1df46741d49026f91e2cb1df4d5 not found: ID does not exist" Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.636511 4752 scope.go:117] "RemoveContainer" containerID="f80becd77050dfa57b8f99dda87853ea911c427d298844842097d02780ecbf0a" Sep 29 10:47:58 crc kubenswrapper[4752]: E0929 10:47:58.636890 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f80becd77050dfa57b8f99dda87853ea911c427d298844842097d02780ecbf0a\": container with ID starting with f80becd77050dfa57b8f99dda87853ea911c427d298844842097d02780ecbf0a not found: ID does not exist" containerID="f80becd77050dfa57b8f99dda87853ea911c427d298844842097d02780ecbf0a" Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.636942 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80becd77050dfa57b8f99dda87853ea911c427d298844842097d02780ecbf0a"} err="failed to get container status \"f80becd77050dfa57b8f99dda87853ea911c427d298844842097d02780ecbf0a\": rpc error: code = NotFound desc = could not find container \"f80becd77050dfa57b8f99dda87853ea911c427d298844842097d02780ecbf0a\": container with ID starting with f80becd77050dfa57b8f99dda87853ea911c427d298844842097d02780ecbf0a not found: ID does not exist" Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.716476 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5a042fcf-8877-4ea2-97fc-272870ff20d3-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.904648 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ppd9s"] Sep 29 10:47:58 crc kubenswrapper[4752]: I0929 10:47:58.910458 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ppd9s"] Sep 29 10:48:00 crc kubenswrapper[4752]: I0929 10:48:00.039333 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a042fcf-8877-4ea2-97fc-272870ff20d3" path="/var/lib/kubelet/pods/5a042fcf-8877-4ea2-97fc-272870ff20d3/volumes" Sep 29 10:48:00 crc kubenswrapper[4752]: I0929 10:48:00.930056 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5tlth" Sep 29 10:48:01 crc kubenswrapper[4752]: I0929 10:48:01.305718 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l2pww" Sep 29 10:48:02 crc kubenswrapper[4752]: I0929 10:48:02.472871 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l2pww"] Sep 29 10:48:02 crc kubenswrapper[4752]: I0929 10:48:02.473287 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l2pww" podUID="501f967f-86b7-41e1-b650-0b122766a576" containerName="registry-server" containerID="cri-o://df27cb1caef62898401c63c9994bd99c9de931517c3bd10ecdc3388d861e7652" gracePeriod=2 Sep 29 10:48:02 crc kubenswrapper[4752]: I0929 10:48:02.605275 4752 generic.go:334] "Generic (PLEG): container finished" podID="501f967f-86b7-41e1-b650-0b122766a576" containerID="df27cb1caef62898401c63c9994bd99c9de931517c3bd10ecdc3388d861e7652" exitCode=0 Sep 29 10:48:02 crc kubenswrapper[4752]: I0929 10:48:02.605354 4752 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-l2pww" event={"ID":"501f967f-86b7-41e1-b650-0b122766a576","Type":"ContainerDied","Data":"df27cb1caef62898401c63c9994bd99c9de931517c3bd10ecdc3388d861e7652"} Sep 29 10:48:02 crc kubenswrapper[4752]: I0929 10:48:02.867704 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2pww" Sep 29 10:48:02 crc kubenswrapper[4752]: I0929 10:48:02.992362 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/501f967f-86b7-41e1-b650-0b122766a576-utilities\") pod \"501f967f-86b7-41e1-b650-0b122766a576\" (UID: \"501f967f-86b7-41e1-b650-0b122766a576\") " Sep 29 10:48:02 crc kubenswrapper[4752]: I0929 10:48:02.992494 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fb64\" (UniqueName: \"kubernetes.io/projected/501f967f-86b7-41e1-b650-0b122766a576-kube-api-access-5fb64\") pod \"501f967f-86b7-41e1-b650-0b122766a576\" (UID: \"501f967f-86b7-41e1-b650-0b122766a576\") " Sep 29 10:48:02 crc kubenswrapper[4752]: I0929 10:48:02.992626 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/501f967f-86b7-41e1-b650-0b122766a576-catalog-content\") pod \"501f967f-86b7-41e1-b650-0b122766a576\" (UID: \"501f967f-86b7-41e1-b650-0b122766a576\") " Sep 29 10:48:02 crc kubenswrapper[4752]: I0929 10:48:02.993819 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/501f967f-86b7-41e1-b650-0b122766a576-utilities" (OuterVolumeSpecName: "utilities") pod "501f967f-86b7-41e1-b650-0b122766a576" (UID: "501f967f-86b7-41e1-b650-0b122766a576"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:48:03 crc kubenswrapper[4752]: I0929 10:48:03.003441 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/501f967f-86b7-41e1-b650-0b122766a576-kube-api-access-5fb64" (OuterVolumeSpecName: "kube-api-access-5fb64") pod "501f967f-86b7-41e1-b650-0b122766a576" (UID: "501f967f-86b7-41e1-b650-0b122766a576"). InnerVolumeSpecName "kube-api-access-5fb64". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:48:03 crc kubenswrapper[4752]: I0929 10:48:03.051978 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/501f967f-86b7-41e1-b650-0b122766a576-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "501f967f-86b7-41e1-b650-0b122766a576" (UID: "501f967f-86b7-41e1-b650-0b122766a576"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:48:03 crc kubenswrapper[4752]: I0929 10:48:03.094463 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fb64\" (UniqueName: \"kubernetes.io/projected/501f967f-86b7-41e1-b650-0b122766a576-kube-api-access-5fb64\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:03 crc kubenswrapper[4752]: I0929 10:48:03.094516 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/501f967f-86b7-41e1-b650-0b122766a576-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:03 crc kubenswrapper[4752]: I0929 10:48:03.094528 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/501f967f-86b7-41e1-b650-0b122766a576-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:03 crc kubenswrapper[4752]: I0929 10:48:03.614737 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2pww" 
event={"ID":"501f967f-86b7-41e1-b650-0b122766a576","Type":"ContainerDied","Data":"ca68c44f333d453fd782d1c1309ae2f10484695a1c5ab32feb33adc5998c1960"} Sep 29 10:48:03 crc kubenswrapper[4752]: I0929 10:48:03.614816 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2pww" Sep 29 10:48:03 crc kubenswrapper[4752]: I0929 10:48:03.615087 4752 scope.go:117] "RemoveContainer" containerID="df27cb1caef62898401c63c9994bd99c9de931517c3bd10ecdc3388d861e7652" Sep 29 10:48:03 crc kubenswrapper[4752]: I0929 10:48:03.636149 4752 scope.go:117] "RemoveContainer" containerID="c490bba0c471aec9bea737e85365d9cc8c2d4cb402f2c6ae8f1eee17ee712449" Sep 29 10:48:03 crc kubenswrapper[4752]: I0929 10:48:03.647473 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l2pww"] Sep 29 10:48:03 crc kubenswrapper[4752]: I0929 10:48:03.649999 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l2pww"] Sep 29 10:48:03 crc kubenswrapper[4752]: I0929 10:48:03.657462 4752 scope.go:117] "RemoveContainer" containerID="2671ca5fa0374435f5ee61e712f3266a8816685bd16c407b477bd365cef7b46c" Sep 29 10:48:04 crc kubenswrapper[4752]: I0929 10:48:04.037876 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="501f967f-86b7-41e1-b650-0b122766a576" path="/var/lib/kubelet/pods/501f967f-86b7-41e1-b650-0b122766a576/volumes" Sep 29 10:48:16 crc kubenswrapper[4752]: I0929 10:48:16.733889 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" podUID="f00d489f-9b4a-421a-a02a-b9b090ae0449" containerName="oauth-openshift" containerID="cri-o://0bf50d9bebe257170660e11029e3a817b7b0c30c5acf5ae33ab8d53fbc1261a4" gracePeriod=15 Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.119176 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.153848 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-57bcd9fbb-45l59"] Sep 29 10:48:17 crc kubenswrapper[4752]: E0929 10:48:17.154116 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f00d489f-9b4a-421a-a02a-b9b090ae0449" containerName="oauth-openshift" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154134 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f00d489f-9b4a-421a-a02a-b9b090ae0449" containerName="oauth-openshift" Sep 29 10:48:17 crc kubenswrapper[4752]: E0929 10:48:17.154146 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a042fcf-8877-4ea2-97fc-272870ff20d3" containerName="extract-content" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154153 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a042fcf-8877-4ea2-97fc-272870ff20d3" containerName="extract-content" Sep 29 10:48:17 crc kubenswrapper[4752]: E0929 10:48:17.154171 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501f967f-86b7-41e1-b650-0b122766a576" containerName="registry-server" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154178 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="501f967f-86b7-41e1-b650-0b122766a576" containerName="registry-server" Sep 29 10:48:17 crc kubenswrapper[4752]: E0929 10:48:17.154193 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b4288c-7657-4647-b24e-98547f53c24f" containerName="registry-server" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154201 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b4288c-7657-4647-b24e-98547f53c24f" containerName="registry-server" Sep 29 10:48:17 crc kubenswrapper[4752]: E0929 10:48:17.154212 4752 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8142731c-cdef-4d76-aeae-bec64f2cb840" containerName="extract-content" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154218 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8142731c-cdef-4d76-aeae-bec64f2cb840" containerName="extract-content" Sep 29 10:48:17 crc kubenswrapper[4752]: E0929 10:48:17.154226 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501f967f-86b7-41e1-b650-0b122766a576" containerName="extract-content" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154233 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="501f967f-86b7-41e1-b650-0b122766a576" containerName="extract-content" Sep 29 10:48:17 crc kubenswrapper[4752]: E0929 10:48:17.154246 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8142731c-cdef-4d76-aeae-bec64f2cb840" containerName="extract-utilities" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154254 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8142731c-cdef-4d76-aeae-bec64f2cb840" containerName="extract-utilities" Sep 29 10:48:17 crc kubenswrapper[4752]: E0929 10:48:17.154265 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8142731c-cdef-4d76-aeae-bec64f2cb840" containerName="registry-server" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154273 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8142731c-cdef-4d76-aeae-bec64f2cb840" containerName="registry-server" Sep 29 10:48:17 crc kubenswrapper[4752]: E0929 10:48:17.154281 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b4288c-7657-4647-b24e-98547f53c24f" containerName="extract-utilities" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154287 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b4288c-7657-4647-b24e-98547f53c24f" containerName="extract-utilities" Sep 29 10:48:17 crc kubenswrapper[4752]: E0929 10:48:17.154296 4752 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5a042fcf-8877-4ea2-97fc-272870ff20d3" containerName="extract-utilities" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154303 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a042fcf-8877-4ea2-97fc-272870ff20d3" containerName="extract-utilities" Sep 29 10:48:17 crc kubenswrapper[4752]: E0929 10:48:17.154317 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae8c092c-ec9d-456a-9ba3-5501c22f6280" containerName="collect-profiles" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154330 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae8c092c-ec9d-456a-9ba3-5501c22f6280" containerName="collect-profiles" Sep 29 10:48:17 crc kubenswrapper[4752]: E0929 10:48:17.154344 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb4bcdba-8a53-409c-9c6d-a8d464321183" containerName="pruner" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154353 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb4bcdba-8a53-409c-9c6d-a8d464321183" containerName="pruner" Sep 29 10:48:17 crc kubenswrapper[4752]: E0929 10:48:17.154365 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29994a3c-c9d4-436f-b6bf-c46f1cd81a57" containerName="pruner" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154374 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="29994a3c-c9d4-436f-b6bf-c46f1cd81a57" containerName="pruner" Sep 29 10:48:17 crc kubenswrapper[4752]: E0929 10:48:17.154386 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b4288c-7657-4647-b24e-98547f53c24f" containerName="extract-content" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154393 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b4288c-7657-4647-b24e-98547f53c24f" containerName="extract-content" Sep 29 10:48:17 crc kubenswrapper[4752]: E0929 10:48:17.154403 4752 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="501f967f-86b7-41e1-b650-0b122766a576" containerName="extract-utilities" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154410 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="501f967f-86b7-41e1-b650-0b122766a576" containerName="extract-utilities" Sep 29 10:48:17 crc kubenswrapper[4752]: E0929 10:48:17.154417 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a042fcf-8877-4ea2-97fc-272870ff20d3" containerName="registry-server" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154423 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a042fcf-8877-4ea2-97fc-272870ff20d3" containerName="registry-server" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154536 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="501f967f-86b7-41e1-b650-0b122766a576" containerName="registry-server" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154547 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb4bcdba-8a53-409c-9c6d-a8d464321183" containerName="pruner" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154557 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="8142731c-cdef-4d76-aeae-bec64f2cb840" containerName="registry-server" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154571 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="29994a3c-c9d4-436f-b6bf-c46f1cd81a57" containerName="pruner" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154581 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a042fcf-8877-4ea2-97fc-272870ff20d3" containerName="registry-server" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154590 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="37b4288c-7657-4647-b24e-98547f53c24f" containerName="registry-server" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154601 4752 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f00d489f-9b4a-421a-a02a-b9b090ae0449" containerName="oauth-openshift" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.154609 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae8c092c-ec9d-456a-9ba3-5501c22f6280" containerName="collect-profiles" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.155094 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.168315 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-57bcd9fbb-45l59"] Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.181169 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-template-login\") pod \"f00d489f-9b4a-421a-a02a-b9b090ae0449\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.181271 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-ocp-branding-template\") pod \"f00d489f-9b4a-421a-a02a-b9b090ae0449\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.181315 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-serving-cert\") pod \"f00d489f-9b4a-421a-a02a-b9b090ae0449\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.182217 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-trusted-ca-bundle\") pod \"f00d489f-9b4a-421a-a02a-b9b090ae0449\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.182264 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrgs6\" (UniqueName: \"kubernetes.io/projected/f00d489f-9b4a-421a-a02a-b9b090ae0449-kube-api-access-qrgs6\") pod \"f00d489f-9b4a-421a-a02a-b9b090ae0449\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.182289 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-template-error\") pod \"f00d489f-9b4a-421a-a02a-b9b090ae0449\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.182309 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-router-certs\") pod \"f00d489f-9b4a-421a-a02a-b9b090ae0449\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.182326 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-idp-0-file-data\") pod \"f00d489f-9b4a-421a-a02a-b9b090ae0449\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.182362 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-template-provider-selection\") pod \"f00d489f-9b4a-421a-a02a-b9b090ae0449\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.182685 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f00d489f-9b4a-421a-a02a-b9b090ae0449" (UID: "f00d489f-9b4a-421a-a02a-b9b090ae0449"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.182768 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f00d489f-9b4a-421a-a02a-b9b090ae0449-audit-dir\") pod \"f00d489f-9b4a-421a-a02a-b9b090ae0449\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.182821 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-audit-policies\") pod \"f00d489f-9b4a-421a-a02a-b9b090ae0449\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.182852 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-cliconfig\") pod \"f00d489f-9b4a-421a-a02a-b9b090ae0449\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.182888 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-service-ca\") pod \"f00d489f-9b4a-421a-a02a-b9b090ae0449\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.182912 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-session\") pod \"f00d489f-9b4a-421a-a02a-b9b090ae0449\" (UID: \"f00d489f-9b4a-421a-a02a-b9b090ae0449\") " Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.183619 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f00d489f-9b4a-421a-a02a-b9b090ae0449" (UID: "f00d489f-9b4a-421a-a02a-b9b090ae0449"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.183616 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f00d489f-9b4a-421a-a02a-b9b090ae0449" (UID: "f00d489f-9b4a-421a-a02a-b9b090ae0449"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.183685 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f00d489f-9b4a-421a-a02a-b9b090ae0449-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f00d489f-9b4a-421a-a02a-b9b090ae0449" (UID: "f00d489f-9b4a-421a-a02a-b9b090ae0449"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.184250 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f00d489f-9b4a-421a-a02a-b9b090ae0449" (UID: "f00d489f-9b4a-421a-a02a-b9b090ae0449"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.187419 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-service-ca\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.187596 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-user-template-login\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.187669 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.187748 
4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-session\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.187782 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.187841 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8plq\" (UniqueName: \"kubernetes.io/projected/44768ad2-2803-4a09-b9a2-0e2572f1a59d-kube-api-access-v8plq\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.187864 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.187910 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/44768ad2-2803-4a09-b9a2-0e2572f1a59d-audit-policies\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.188231 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.188332 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.188531 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.188731 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/44768ad2-2803-4a09-b9a2-0e2572f1a59d-audit-dir\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " 
pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.190151 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-router-certs\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.190232 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-user-template-error\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.190584 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.190613 4752 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f00d489f-9b4a-421a-a02a-b9b090ae0449-audit-dir\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.190628 4752 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.190645 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.190658 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.190719 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f00d489f-9b4a-421a-a02a-b9b090ae0449-kube-api-access-qrgs6" (OuterVolumeSpecName: "kube-api-access-qrgs6") pod "f00d489f-9b4a-421a-a02a-b9b090ae0449" (UID: "f00d489f-9b4a-421a-a02a-b9b090ae0449"). InnerVolumeSpecName "kube-api-access-qrgs6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.192082 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f00d489f-9b4a-421a-a02a-b9b090ae0449" (UID: "f00d489f-9b4a-421a-a02a-b9b090ae0449"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.192099 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f00d489f-9b4a-421a-a02a-b9b090ae0449" (UID: "f00d489f-9b4a-421a-a02a-b9b090ae0449"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.192929 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f00d489f-9b4a-421a-a02a-b9b090ae0449" (UID: "f00d489f-9b4a-421a-a02a-b9b090ae0449"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.193257 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f00d489f-9b4a-421a-a02a-b9b090ae0449" (UID: "f00d489f-9b4a-421a-a02a-b9b090ae0449"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.194172 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f00d489f-9b4a-421a-a02a-b9b090ae0449" (UID: "f00d489f-9b4a-421a-a02a-b9b090ae0449"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.196874 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f00d489f-9b4a-421a-a02a-b9b090ae0449" (UID: "f00d489f-9b4a-421a-a02a-b9b090ae0449"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.197753 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f00d489f-9b4a-421a-a02a-b9b090ae0449" (UID: "f00d489f-9b4a-421a-a02a-b9b090ae0449"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.198915 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f00d489f-9b4a-421a-a02a-b9b090ae0449" (UID: "f00d489f-9b4a-421a-a02a-b9b090ae0449"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.291652 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/44768ad2-2803-4a09-b9a2-0e2572f1a59d-audit-dir\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.291841 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/44768ad2-2803-4a09-b9a2-0e2572f1a59d-audit-dir\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.292178 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-router-certs\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.292262 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-user-template-error\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.292320 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-service-ca\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.292388 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-user-template-login\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.292427 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " 
pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.292488 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-session\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.292532 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.292565 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8plq\" (UniqueName: \"kubernetes.io/projected/44768ad2-2803-4a09-b9a2-0e2572f1a59d-kube-api-access-v8plq\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.292596 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.292622 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/44768ad2-2803-4a09-b9a2-0e2572f1a59d-audit-policies\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.292705 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.293053 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.293130 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.293235 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.293260 4752 reconciler_common.go:293] "Volume detached 
for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.293279 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.293295 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrgs6\" (UniqueName: \"kubernetes.io/projected/f00d489f-9b4a-421a-a02a-b9b090ae0449-kube-api-access-qrgs6\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.293309 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.293324 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.293339 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.293358 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-user-template-provider-selection\") on node \"crc\" 
DevicePath \"\"" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.293373 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f00d489f-9b4a-421a-a02a-b9b090ae0449-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.294456 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/44768ad2-2803-4a09-b9a2-0e2572f1a59d-audit-policies\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.294686 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.294724 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-service-ca\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.294732 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " 
pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.296432 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-user-template-error\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.296511 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.297018 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.297773 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.297880 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" 
(UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-session\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.297988 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-user-template-login\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.298589 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-router-certs\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.299373 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/44768ad2-2803-4a09-b9a2-0e2572f1a59d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.309179 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8plq\" (UniqueName: \"kubernetes.io/projected/44768ad2-2803-4a09-b9a2-0e2572f1a59d-kube-api-access-v8plq\") pod \"oauth-openshift-57bcd9fbb-45l59\" (UID: \"44768ad2-2803-4a09-b9a2-0e2572f1a59d\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 
10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.479087 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.697303 4752 generic.go:334] "Generic (PLEG): container finished" podID="f00d489f-9b4a-421a-a02a-b9b090ae0449" containerID="0bf50d9bebe257170660e11029e3a817b7b0c30c5acf5ae33ab8d53fbc1261a4" exitCode=0 Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.697383 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.697411 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" event={"ID":"f00d489f-9b4a-421a-a02a-b9b090ae0449","Type":"ContainerDied","Data":"0bf50d9bebe257170660e11029e3a817b7b0c30c5acf5ae33ab8d53fbc1261a4"} Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.698027 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-prpgr" event={"ID":"f00d489f-9b4a-421a-a02a-b9b090ae0449","Type":"ContainerDied","Data":"a2f5b94cf3e1b07e27ccd94c0b80eb6a995b63ab6786bd1ae83da50d7a4e25c7"} Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.698063 4752 scope.go:117] "RemoveContainer" containerID="0bf50d9bebe257170660e11029e3a817b7b0c30c5acf5ae33ab8d53fbc1261a4" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.699025 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-57bcd9fbb-45l59"] Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.722025 4752 scope.go:117] "RemoveContainer" containerID="0bf50d9bebe257170660e11029e3a817b7b0c30c5acf5ae33ab8d53fbc1261a4" Sep 29 10:48:17 crc kubenswrapper[4752]: E0929 10:48:17.723496 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"0bf50d9bebe257170660e11029e3a817b7b0c30c5acf5ae33ab8d53fbc1261a4\": container with ID starting with 0bf50d9bebe257170660e11029e3a817b7b0c30c5acf5ae33ab8d53fbc1261a4 not found: ID does not exist" containerID="0bf50d9bebe257170660e11029e3a817b7b0c30c5acf5ae33ab8d53fbc1261a4" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.723551 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bf50d9bebe257170660e11029e3a817b7b0c30c5acf5ae33ab8d53fbc1261a4"} err="failed to get container status \"0bf50d9bebe257170660e11029e3a817b7b0c30c5acf5ae33ab8d53fbc1261a4\": rpc error: code = NotFound desc = could not find container \"0bf50d9bebe257170660e11029e3a817b7b0c30c5acf5ae33ab8d53fbc1261a4\": container with ID starting with 0bf50d9bebe257170660e11029e3a817b7b0c30c5acf5ae33ab8d53fbc1261a4 not found: ID does not exist" Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.731766 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-prpgr"] Sep 29 10:48:17 crc kubenswrapper[4752]: I0929 10:48:17.736292 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-prpgr"] Sep 29 10:48:18 crc kubenswrapper[4752]: I0929 10:48:18.038745 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f00d489f-9b4a-421a-a02a-b9b090ae0449" path="/var/lib/kubelet/pods/f00d489f-9b4a-421a-a02a-b9b090ae0449/volumes" Sep 29 10:48:18 crc kubenswrapper[4752]: I0929 10:48:18.712056 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" event={"ID":"44768ad2-2803-4a09-b9a2-0e2572f1a59d","Type":"ContainerStarted","Data":"9bb15f30a3eb0b644799e567e19cbeca861c0d68d611e7d4856f1eb26dba2ac1"} Sep 29 10:48:18 crc kubenswrapper[4752]: I0929 10:48:18.712346 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" event={"ID":"44768ad2-2803-4a09-b9a2-0e2572f1a59d","Type":"ContainerStarted","Data":"41b5bc51d33907e938ce60e453a78ce6ef26c0173d7be10556591001e7b53a84"} Sep 29 10:48:18 crc kubenswrapper[4752]: I0929 10:48:18.712499 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:18 crc kubenswrapper[4752]: I0929 10:48:18.719769 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" Sep 29 10:48:18 crc kubenswrapper[4752]: I0929 10:48:18.733367 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-57bcd9fbb-45l59" podStartSLOduration=27.733343373 podStartE2EDuration="27.733343373s" podCreationTimestamp="2025-09-29 10:47:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:48:18.731482772 +0000 UTC m=+239.520624439" watchObservedRunningTime="2025-09-29 10:48:18.733343373 +0000 UTC m=+239.522485040" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.066856 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2pwfh"] Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.067824 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2pwfh" podUID="83d184f7-5dec-4c4c-b53e-d26af311916c" containerName="registry-server" containerID="cri-o://d0c5a7b9b00fc04b5d276284d9ede8949358bd36e57d305311bf53526fb9ae7b" gracePeriod=30 Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.071000 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5tlth"] Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.071241 4752 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5tlth" podUID="06f9e526-21c6-4e20-b1a8-8f4fbfaa6413" containerName="registry-server" containerID="cri-o://d988ee7d19339e1e3336f3096d0ad29d0050f425aaee5ca8edb17f9c87bfef86" gracePeriod=30 Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.088515 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pbmfv"] Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.088776 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" podUID="d04ea3ea-71ac-481c-990b-a989a6f61516" containerName="marketplace-operator" containerID="cri-o://79c2d14bb1544aad70f4c45adb33236836c53b91f8fe8c1bfa1f014194ce9be3" gracePeriod=30 Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.098466 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ks2mq"] Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.098792 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ks2mq" podUID="7beaf483-1002-4e94-a9ee-59e20e83f824" containerName="registry-server" containerID="cri-o://36a7228ab347fdea68adf730b6d6291cc97192132f16007cec3fadbccdb335ce" gracePeriod=30 Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.115329 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kxshj"] Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.115621 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kxshj" podUID="98952fd9-6515-4ba1-8d1a-490a2c3e33b1" containerName="registry-server" containerID="cri-o://f912b42adf2424dd2eef9173682ffea98049dbd253b645455af2dc3509539ee4" gracePeriod=30 Sep 29 10:48:46 crc kubenswrapper[4752]: 
I0929 10:48:46.122212 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-64489"] Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.123166 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-64489" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.137832 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-64489"] Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.179473 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6d06e125-ac1f-4214-8e49-35a46d23413b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-64489\" (UID: \"6d06e125-ac1f-4214-8e49-35a46d23413b\") " pod="openshift-marketplace/marketplace-operator-79b997595-64489" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.179523 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d06e125-ac1f-4214-8e49-35a46d23413b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-64489\" (UID: \"6d06e125-ac1f-4214-8e49-35a46d23413b\") " pod="openshift-marketplace/marketplace-operator-79b997595-64489" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.179583 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66w27\" (UniqueName: \"kubernetes.io/projected/6d06e125-ac1f-4214-8e49-35a46d23413b-kube-api-access-66w27\") pod \"marketplace-operator-79b997595-64489\" (UID: \"6d06e125-ac1f-4214-8e49-35a46d23413b\") " pod="openshift-marketplace/marketplace-operator-79b997595-64489" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.281982 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-66w27\" (UniqueName: \"kubernetes.io/projected/6d06e125-ac1f-4214-8e49-35a46d23413b-kube-api-access-66w27\") pod \"marketplace-operator-79b997595-64489\" (UID: \"6d06e125-ac1f-4214-8e49-35a46d23413b\") " pod="openshift-marketplace/marketplace-operator-79b997595-64489" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.282065 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6d06e125-ac1f-4214-8e49-35a46d23413b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-64489\" (UID: \"6d06e125-ac1f-4214-8e49-35a46d23413b\") " pod="openshift-marketplace/marketplace-operator-79b997595-64489" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.282102 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d06e125-ac1f-4214-8e49-35a46d23413b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-64489\" (UID: \"6d06e125-ac1f-4214-8e49-35a46d23413b\") " pod="openshift-marketplace/marketplace-operator-79b997595-64489" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.285499 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d06e125-ac1f-4214-8e49-35a46d23413b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-64489\" (UID: \"6d06e125-ac1f-4214-8e49-35a46d23413b\") " pod="openshift-marketplace/marketplace-operator-79b997595-64489" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.290336 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6d06e125-ac1f-4214-8e49-35a46d23413b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-64489\" (UID: 
\"6d06e125-ac1f-4214-8e49-35a46d23413b\") " pod="openshift-marketplace/marketplace-operator-79b997595-64489" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.306421 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66w27\" (UniqueName: \"kubernetes.io/projected/6d06e125-ac1f-4214-8e49-35a46d23413b-kube-api-access-66w27\") pod \"marketplace-operator-79b997595-64489\" (UID: \"6d06e125-ac1f-4214-8e49-35a46d23413b\") " pod="openshift-marketplace/marketplace-operator-79b997595-64489" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.541106 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-64489" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.567414 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5tlth" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.573927 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2pwfh" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.585685 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06f9e526-21c6-4e20-b1a8-8f4fbfaa6413-catalog-content\") pod \"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413\" (UID: \"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413\") " Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.585767 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06f9e526-21c6-4e20-b1a8-8f4fbfaa6413-utilities\") pod \"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413\" (UID: \"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413\") " Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.585788 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83d184f7-5dec-4c4c-b53e-d26af311916c-utilities\") pod \"83d184f7-5dec-4c4c-b53e-d26af311916c\" (UID: \"83d184f7-5dec-4c4c-b53e-d26af311916c\") " Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.585848 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83d184f7-5dec-4c4c-b53e-d26af311916c-catalog-content\") pod \"83d184f7-5dec-4c4c-b53e-d26af311916c\" (UID: \"83d184f7-5dec-4c4c-b53e-d26af311916c\") " Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.585979 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc2lj\" (UniqueName: \"kubernetes.io/projected/83d184f7-5dec-4c4c-b53e-d26af311916c-kube-api-access-dc2lj\") pod \"83d184f7-5dec-4c4c-b53e-d26af311916c\" (UID: \"83d184f7-5dec-4c4c-b53e-d26af311916c\") " Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.586042 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-qrxjl\" (UniqueName: \"kubernetes.io/projected/06f9e526-21c6-4e20-b1a8-8f4fbfaa6413-kube-api-access-qrxjl\") pod \"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413\" (UID: \"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413\") " Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.587945 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83d184f7-5dec-4c4c-b53e-d26af311916c-utilities" (OuterVolumeSpecName: "utilities") pod "83d184f7-5dec-4c4c-b53e-d26af311916c" (UID: "83d184f7-5dec-4c4c-b53e-d26af311916c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.589626 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06f9e526-21c6-4e20-b1a8-8f4fbfaa6413-utilities" (OuterVolumeSpecName: "utilities") pod "06f9e526-21c6-4e20-b1a8-8f4fbfaa6413" (UID: "06f9e526-21c6-4e20-b1a8-8f4fbfaa6413"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.596288 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06f9e526-21c6-4e20-b1a8-8f4fbfaa6413-kube-api-access-qrxjl" (OuterVolumeSpecName: "kube-api-access-qrxjl") pod "06f9e526-21c6-4e20-b1a8-8f4fbfaa6413" (UID: "06f9e526-21c6-4e20-b1a8-8f4fbfaa6413"). InnerVolumeSpecName "kube-api-access-qrxjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.605179 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83d184f7-5dec-4c4c-b53e-d26af311916c-kube-api-access-dc2lj" (OuterVolumeSpecName: "kube-api-access-dc2lj") pod "83d184f7-5dec-4c4c-b53e-d26af311916c" (UID: "83d184f7-5dec-4c4c-b53e-d26af311916c"). InnerVolumeSpecName "kube-api-access-dc2lj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.677439 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ks2mq" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.677605 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06f9e526-21c6-4e20-b1a8-8f4fbfaa6413-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06f9e526-21c6-4e20-b1a8-8f4fbfaa6413" (UID: "06f9e526-21c6-4e20-b1a8-8f4fbfaa6413"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.678516 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.681596 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83d184f7-5dec-4c4c-b53e-d26af311916c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83d184f7-5dec-4c4c-b53e-d26af311916c" (UID: "83d184f7-5dec-4c4c-b53e-d26af311916c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.687866 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d04ea3ea-71ac-481c-990b-a989a6f61516-marketplace-operator-metrics\") pod \"d04ea3ea-71ac-481c-990b-a989a6f61516\" (UID: \"d04ea3ea-71ac-481c-990b-a989a6f61516\") " Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.687924 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls7tg\" (UniqueName: \"kubernetes.io/projected/7beaf483-1002-4e94-a9ee-59e20e83f824-kube-api-access-ls7tg\") pod \"7beaf483-1002-4e94-a9ee-59e20e83f824\" (UID: \"7beaf483-1002-4e94-a9ee-59e20e83f824\") " Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.688015 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf22s\" (UniqueName: \"kubernetes.io/projected/d04ea3ea-71ac-481c-990b-a989a6f61516-kube-api-access-gf22s\") pod \"d04ea3ea-71ac-481c-990b-a989a6f61516\" (UID: \"d04ea3ea-71ac-481c-990b-a989a6f61516\") " Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.688051 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7beaf483-1002-4e94-a9ee-59e20e83f824-catalog-content\") pod \"7beaf483-1002-4e94-a9ee-59e20e83f824\" (UID: \"7beaf483-1002-4e94-a9ee-59e20e83f824\") " Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.688080 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7beaf483-1002-4e94-a9ee-59e20e83f824-utilities\") pod \"7beaf483-1002-4e94-a9ee-59e20e83f824\" (UID: \"7beaf483-1002-4e94-a9ee-59e20e83f824\") " Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.688342 4752 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06f9e526-21c6-4e20-b1a8-8f4fbfaa6413-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.688355 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06f9e526-21c6-4e20-b1a8-8f4fbfaa6413-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.688366 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83d184f7-5dec-4c4c-b53e-d26af311916c-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.688375 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83d184f7-5dec-4c4c-b53e-d26af311916c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.688387 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc2lj\" (UniqueName: \"kubernetes.io/projected/83d184f7-5dec-4c4c-b53e-d26af311916c-kube-api-access-dc2lj\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.688397 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrxjl\" (UniqueName: \"kubernetes.io/projected/06f9e526-21c6-4e20-b1a8-8f4fbfaa6413-kube-api-access-qrxjl\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.689093 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7beaf483-1002-4e94-a9ee-59e20e83f824-utilities" (OuterVolumeSpecName: "utilities") pod "7beaf483-1002-4e94-a9ee-59e20e83f824" (UID: "7beaf483-1002-4e94-a9ee-59e20e83f824"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.691674 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7beaf483-1002-4e94-a9ee-59e20e83f824-kube-api-access-ls7tg" (OuterVolumeSpecName: "kube-api-access-ls7tg") pod "7beaf483-1002-4e94-a9ee-59e20e83f824" (UID: "7beaf483-1002-4e94-a9ee-59e20e83f824"). InnerVolumeSpecName "kube-api-access-ls7tg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.692901 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d04ea3ea-71ac-481c-990b-a989a6f61516-kube-api-access-gf22s" (OuterVolumeSpecName: "kube-api-access-gf22s") pod "d04ea3ea-71ac-481c-990b-a989a6f61516" (UID: "d04ea3ea-71ac-481c-990b-a989a6f61516"). InnerVolumeSpecName "kube-api-access-gf22s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.693059 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d04ea3ea-71ac-481c-990b-a989a6f61516-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "d04ea3ea-71ac-481c-990b-a989a6f61516" (UID: "d04ea3ea-71ac-481c-990b-a989a6f61516"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.701068 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kxshj" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.716050 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7beaf483-1002-4e94-a9ee-59e20e83f824-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7beaf483-1002-4e94-a9ee-59e20e83f824" (UID: "7beaf483-1002-4e94-a9ee-59e20e83f824"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.792124 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d04ea3ea-71ac-481c-990b-a989a6f61516-marketplace-trusted-ca\") pod \"d04ea3ea-71ac-481c-990b-a989a6f61516\" (UID: \"d04ea3ea-71ac-481c-990b-a989a6f61516\") " Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.792173 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98952fd9-6515-4ba1-8d1a-490a2c3e33b1-utilities\") pod \"98952fd9-6515-4ba1-8d1a-490a2c3e33b1\" (UID: \"98952fd9-6515-4ba1-8d1a-490a2c3e33b1\") " Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.792325 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlzgx\" (UniqueName: \"kubernetes.io/projected/98952fd9-6515-4ba1-8d1a-490a2c3e33b1-kube-api-access-xlzgx\") pod \"98952fd9-6515-4ba1-8d1a-490a2c3e33b1\" (UID: \"98952fd9-6515-4ba1-8d1a-490a2c3e33b1\") " Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.792586 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d04ea3ea-71ac-481c-990b-a989a6f61516-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "d04ea3ea-71ac-481c-990b-a989a6f61516" (UID: "d04ea3ea-71ac-481c-990b-a989a6f61516"). 
InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.792994 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98952fd9-6515-4ba1-8d1a-490a2c3e33b1-catalog-content\") pod \"98952fd9-6515-4ba1-8d1a-490a2c3e33b1\" (UID: \"98952fd9-6515-4ba1-8d1a-490a2c3e33b1\") " Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.793397 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98952fd9-6515-4ba1-8d1a-490a2c3e33b1-utilities" (OuterVolumeSpecName: "utilities") pod "98952fd9-6515-4ba1-8d1a-490a2c3e33b1" (UID: "98952fd9-6515-4ba1-8d1a-490a2c3e33b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.795365 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7beaf483-1002-4e94-a9ee-59e20e83f824-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.795586 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7beaf483-1002-4e94-a9ee-59e20e83f824-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.795769 4752 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d04ea3ea-71ac-481c-990b-a989a6f61516-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.795953 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls7tg\" (UniqueName: \"kubernetes.io/projected/7beaf483-1002-4e94-a9ee-59e20e83f824-kube-api-access-ls7tg\") on node \"crc\" DevicePath \"\"" Sep 29 
10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.796066 4752 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d04ea3ea-71ac-481c-990b-a989a6f61516-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.796180 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98952fd9-6515-4ba1-8d1a-490a2c3e33b1-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.796259 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf22s\" (UniqueName: \"kubernetes.io/projected/d04ea3ea-71ac-481c-990b-a989a6f61516-kube-api-access-gf22s\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.798982 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98952fd9-6515-4ba1-8d1a-490a2c3e33b1-kube-api-access-xlzgx" (OuterVolumeSpecName: "kube-api-access-xlzgx") pod "98952fd9-6515-4ba1-8d1a-490a2c3e33b1" (UID: "98952fd9-6515-4ba1-8d1a-490a2c3e33b1"). InnerVolumeSpecName "kube-api-access-xlzgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.883389 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98952fd9-6515-4ba1-8d1a-490a2c3e33b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98952fd9-6515-4ba1-8d1a-490a2c3e33b1" (UID: "98952fd9-6515-4ba1-8d1a-490a2c3e33b1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.885205 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-64489"] Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.888822 4752 generic.go:334] "Generic (PLEG): container finished" podID="d04ea3ea-71ac-481c-990b-a989a6f61516" containerID="79c2d14bb1544aad70f4c45adb33236836c53b91f8fe8c1bfa1f014194ce9be3" exitCode=0 Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.888916 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" event={"ID":"d04ea3ea-71ac-481c-990b-a989a6f61516","Type":"ContainerDied","Data":"79c2d14bb1544aad70f4c45adb33236836c53b91f8fe8c1bfa1f014194ce9be3"} Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.888947 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" event={"ID":"d04ea3ea-71ac-481c-990b-a989a6f61516","Type":"ContainerDied","Data":"13c8ddc139cf32e517f16ddee6bdc81caf18342fe4a605c1a7be7368d909e3e6"} Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.888968 4752 scope.go:117] "RemoveContainer" containerID="79c2d14bb1544aad70f4c45adb33236836c53b91f8fe8c1bfa1f014194ce9be3" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.889110 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pbmfv" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.897389 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlzgx\" (UniqueName: \"kubernetes.io/projected/98952fd9-6515-4ba1-8d1a-490a2c3e33b1-kube-api-access-xlzgx\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.897438 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98952fd9-6515-4ba1-8d1a-490a2c3e33b1-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.902012 4752 generic.go:334] "Generic (PLEG): container finished" podID="7beaf483-1002-4e94-a9ee-59e20e83f824" containerID="36a7228ab347fdea68adf730b6d6291cc97192132f16007cec3fadbccdb335ce" exitCode=0 Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.902123 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ks2mq" event={"ID":"7beaf483-1002-4e94-a9ee-59e20e83f824","Type":"ContainerDied","Data":"36a7228ab347fdea68adf730b6d6291cc97192132f16007cec3fadbccdb335ce"} Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.902140 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ks2mq" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.902172 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ks2mq" event={"ID":"7beaf483-1002-4e94-a9ee-59e20e83f824","Type":"ContainerDied","Data":"b7569a36b8ee02a6a5ce23995563465893751cd23f5dea2bc883496b095b26be"} Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.906911 4752 generic.go:334] "Generic (PLEG): container finished" podID="06f9e526-21c6-4e20-b1a8-8f4fbfaa6413" containerID="d988ee7d19339e1e3336f3096d0ad29d0050f425aaee5ca8edb17f9c87bfef86" exitCode=0 Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.906976 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5tlth" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.906990 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tlth" event={"ID":"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413","Type":"ContainerDied","Data":"d988ee7d19339e1e3336f3096d0ad29d0050f425aaee5ca8edb17f9c87bfef86"} Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.907171 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tlth" event={"ID":"06f9e526-21c6-4e20-b1a8-8f4fbfaa6413","Type":"ContainerDied","Data":"724f61baa8b2f9fa972622270ab086c9aa8cccb84359fcc9704a3942f41fb509"} Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.911057 4752 generic.go:334] "Generic (PLEG): container finished" podID="98952fd9-6515-4ba1-8d1a-490a2c3e33b1" containerID="f912b42adf2424dd2eef9173682ffea98049dbd253b645455af2dc3509539ee4" exitCode=0 Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.911090 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kxshj" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.911165 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxshj" event={"ID":"98952fd9-6515-4ba1-8d1a-490a2c3e33b1","Type":"ContainerDied","Data":"f912b42adf2424dd2eef9173682ffea98049dbd253b645455af2dc3509539ee4"} Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.911215 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxshj" event={"ID":"98952fd9-6515-4ba1-8d1a-490a2c3e33b1","Type":"ContainerDied","Data":"6538a00acae804aeba215ba7bea54e6e820516b6b2dd119f655cd22fb4247473"} Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.919511 4752 generic.go:334] "Generic (PLEG): container finished" podID="83d184f7-5dec-4c4c-b53e-d26af311916c" containerID="d0c5a7b9b00fc04b5d276284d9ede8949358bd36e57d305311bf53526fb9ae7b" exitCode=0 Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.919545 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pwfh" event={"ID":"83d184f7-5dec-4c4c-b53e-d26af311916c","Type":"ContainerDied","Data":"d0c5a7b9b00fc04b5d276284d9ede8949358bd36e57d305311bf53526fb9ae7b"} Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.919568 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pwfh" event={"ID":"83d184f7-5dec-4c4c-b53e-d26af311916c","Type":"ContainerDied","Data":"478dfb4261d161fbc523a095992095d392e0cb9e8221b31213d211a787458bc9"} Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.919627 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2pwfh" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.920688 4752 scope.go:117] "RemoveContainer" containerID="79c2d14bb1544aad70f4c45adb33236836c53b91f8fe8c1bfa1f014194ce9be3" Sep 29 10:48:46 crc kubenswrapper[4752]: E0929 10:48:46.926522 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79c2d14bb1544aad70f4c45adb33236836c53b91f8fe8c1bfa1f014194ce9be3\": container with ID starting with 79c2d14bb1544aad70f4c45adb33236836c53b91f8fe8c1bfa1f014194ce9be3 not found: ID does not exist" containerID="79c2d14bb1544aad70f4c45adb33236836c53b91f8fe8c1bfa1f014194ce9be3" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.926589 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79c2d14bb1544aad70f4c45adb33236836c53b91f8fe8c1bfa1f014194ce9be3"} err="failed to get container status \"79c2d14bb1544aad70f4c45adb33236836c53b91f8fe8c1bfa1f014194ce9be3\": rpc error: code = NotFound desc = could not find container \"79c2d14bb1544aad70f4c45adb33236836c53b91f8fe8c1bfa1f014194ce9be3\": container with ID starting with 79c2d14bb1544aad70f4c45adb33236836c53b91f8fe8c1bfa1f014194ce9be3 not found: ID does not exist" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.926627 4752 scope.go:117] "RemoveContainer" containerID="36a7228ab347fdea68adf730b6d6291cc97192132f16007cec3fadbccdb335ce" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.945066 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pbmfv"] Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.958339 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pbmfv"] Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.961133 4752 scope.go:117] "RemoveContainer" 
containerID="74a6ebf8ba535654ecb7d4e6402ee5f2f831fe1ffd7af9a21b39838d4080455c" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.969638 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ks2mq"] Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.975897 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ks2mq"] Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.978644 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5tlth"] Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.980874 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5tlth"] Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.984635 4752 scope.go:117] "RemoveContainer" containerID="801d8c51667f4a39d9989d384b4593ba509f6df68a623832900642a45a18e098" Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.992754 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kxshj"] Sep 29 10:48:46 crc kubenswrapper[4752]: I0929 10:48:46.995214 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kxshj"] Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.007772 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2pwfh"] Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.010459 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2pwfh"] Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.016284 4752 scope.go:117] "RemoveContainer" containerID="36a7228ab347fdea68adf730b6d6291cc97192132f16007cec3fadbccdb335ce" Sep 29 10:48:47 crc kubenswrapper[4752]: E0929 10:48:47.016948 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"36a7228ab347fdea68adf730b6d6291cc97192132f16007cec3fadbccdb335ce\": container with ID starting with 36a7228ab347fdea68adf730b6d6291cc97192132f16007cec3fadbccdb335ce not found: ID does not exist" containerID="36a7228ab347fdea68adf730b6d6291cc97192132f16007cec3fadbccdb335ce" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.017010 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36a7228ab347fdea68adf730b6d6291cc97192132f16007cec3fadbccdb335ce"} err="failed to get container status \"36a7228ab347fdea68adf730b6d6291cc97192132f16007cec3fadbccdb335ce\": rpc error: code = NotFound desc = could not find container \"36a7228ab347fdea68adf730b6d6291cc97192132f16007cec3fadbccdb335ce\": container with ID starting with 36a7228ab347fdea68adf730b6d6291cc97192132f16007cec3fadbccdb335ce not found: ID does not exist" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.017046 4752 scope.go:117] "RemoveContainer" containerID="74a6ebf8ba535654ecb7d4e6402ee5f2f831fe1ffd7af9a21b39838d4080455c" Sep 29 10:48:47 crc kubenswrapper[4752]: E0929 10:48:47.017501 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74a6ebf8ba535654ecb7d4e6402ee5f2f831fe1ffd7af9a21b39838d4080455c\": container with ID starting with 74a6ebf8ba535654ecb7d4e6402ee5f2f831fe1ffd7af9a21b39838d4080455c not found: ID does not exist" containerID="74a6ebf8ba535654ecb7d4e6402ee5f2f831fe1ffd7af9a21b39838d4080455c" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.017540 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a6ebf8ba535654ecb7d4e6402ee5f2f831fe1ffd7af9a21b39838d4080455c"} err="failed to get container status \"74a6ebf8ba535654ecb7d4e6402ee5f2f831fe1ffd7af9a21b39838d4080455c\": rpc error: code = NotFound desc = could not find container \"74a6ebf8ba535654ecb7d4e6402ee5f2f831fe1ffd7af9a21b39838d4080455c\": container 
with ID starting with 74a6ebf8ba535654ecb7d4e6402ee5f2f831fe1ffd7af9a21b39838d4080455c not found: ID does not exist" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.017559 4752 scope.go:117] "RemoveContainer" containerID="801d8c51667f4a39d9989d384b4593ba509f6df68a623832900642a45a18e098" Sep 29 10:48:47 crc kubenswrapper[4752]: E0929 10:48:47.017783 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"801d8c51667f4a39d9989d384b4593ba509f6df68a623832900642a45a18e098\": container with ID starting with 801d8c51667f4a39d9989d384b4593ba509f6df68a623832900642a45a18e098 not found: ID does not exist" containerID="801d8c51667f4a39d9989d384b4593ba509f6df68a623832900642a45a18e098" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.017815 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"801d8c51667f4a39d9989d384b4593ba509f6df68a623832900642a45a18e098"} err="failed to get container status \"801d8c51667f4a39d9989d384b4593ba509f6df68a623832900642a45a18e098\": rpc error: code = NotFound desc = could not find container \"801d8c51667f4a39d9989d384b4593ba509f6df68a623832900642a45a18e098\": container with ID starting with 801d8c51667f4a39d9989d384b4593ba509f6df68a623832900642a45a18e098 not found: ID does not exist" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.017828 4752 scope.go:117] "RemoveContainer" containerID="d988ee7d19339e1e3336f3096d0ad29d0050f425aaee5ca8edb17f9c87bfef86" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.032661 4752 scope.go:117] "RemoveContainer" containerID="10e0712e6f14e2255ad914c318ba42f3a3566ba33c400cb8b48a2361a4bd0698" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.048949 4752 scope.go:117] "RemoveContainer" containerID="0c2575ba5de5d4677207db7b3a87e1307f2d55f53f76e0a828969c23ce075671" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.064064 4752 scope.go:117] "RemoveContainer" 
containerID="d988ee7d19339e1e3336f3096d0ad29d0050f425aaee5ca8edb17f9c87bfef86" Sep 29 10:48:47 crc kubenswrapper[4752]: E0929 10:48:47.064459 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d988ee7d19339e1e3336f3096d0ad29d0050f425aaee5ca8edb17f9c87bfef86\": container with ID starting with d988ee7d19339e1e3336f3096d0ad29d0050f425aaee5ca8edb17f9c87bfef86 not found: ID does not exist" containerID="d988ee7d19339e1e3336f3096d0ad29d0050f425aaee5ca8edb17f9c87bfef86" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.064510 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d988ee7d19339e1e3336f3096d0ad29d0050f425aaee5ca8edb17f9c87bfef86"} err="failed to get container status \"d988ee7d19339e1e3336f3096d0ad29d0050f425aaee5ca8edb17f9c87bfef86\": rpc error: code = NotFound desc = could not find container \"d988ee7d19339e1e3336f3096d0ad29d0050f425aaee5ca8edb17f9c87bfef86\": container with ID starting with d988ee7d19339e1e3336f3096d0ad29d0050f425aaee5ca8edb17f9c87bfef86 not found: ID does not exist" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.064537 4752 scope.go:117] "RemoveContainer" containerID="10e0712e6f14e2255ad914c318ba42f3a3566ba33c400cb8b48a2361a4bd0698" Sep 29 10:48:47 crc kubenswrapper[4752]: E0929 10:48:47.064841 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10e0712e6f14e2255ad914c318ba42f3a3566ba33c400cb8b48a2361a4bd0698\": container with ID starting with 10e0712e6f14e2255ad914c318ba42f3a3566ba33c400cb8b48a2361a4bd0698 not found: ID does not exist" containerID="10e0712e6f14e2255ad914c318ba42f3a3566ba33c400cb8b48a2361a4bd0698" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.064872 4752 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"10e0712e6f14e2255ad914c318ba42f3a3566ba33c400cb8b48a2361a4bd0698"} err="failed to get container status \"10e0712e6f14e2255ad914c318ba42f3a3566ba33c400cb8b48a2361a4bd0698\": rpc error: code = NotFound desc = could not find container \"10e0712e6f14e2255ad914c318ba42f3a3566ba33c400cb8b48a2361a4bd0698\": container with ID starting with 10e0712e6f14e2255ad914c318ba42f3a3566ba33c400cb8b48a2361a4bd0698 not found: ID does not exist" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.064892 4752 scope.go:117] "RemoveContainer" containerID="0c2575ba5de5d4677207db7b3a87e1307f2d55f53f76e0a828969c23ce075671" Sep 29 10:48:47 crc kubenswrapper[4752]: E0929 10:48:47.065116 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c2575ba5de5d4677207db7b3a87e1307f2d55f53f76e0a828969c23ce075671\": container with ID starting with 0c2575ba5de5d4677207db7b3a87e1307f2d55f53f76e0a828969c23ce075671 not found: ID does not exist" containerID="0c2575ba5de5d4677207db7b3a87e1307f2d55f53f76e0a828969c23ce075671" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.065138 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c2575ba5de5d4677207db7b3a87e1307f2d55f53f76e0a828969c23ce075671"} err="failed to get container status \"0c2575ba5de5d4677207db7b3a87e1307f2d55f53f76e0a828969c23ce075671\": rpc error: code = NotFound desc = could not find container \"0c2575ba5de5d4677207db7b3a87e1307f2d55f53f76e0a828969c23ce075671\": container with ID starting with 0c2575ba5de5d4677207db7b3a87e1307f2d55f53f76e0a828969c23ce075671 not found: ID does not exist" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.065152 4752 scope.go:117] "RemoveContainer" containerID="f912b42adf2424dd2eef9173682ffea98049dbd253b645455af2dc3509539ee4" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.079465 4752 scope.go:117] "RemoveContainer" 
containerID="4aa2a3edcf145318abf8b810b931eccb97bec79f868ba747f02873a8fce23af3" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.101123 4752 scope.go:117] "RemoveContainer" containerID="ff48ef37f008c732e90ca6eadc1d92123e1de0a924df91778543cbe249304097" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.119189 4752 scope.go:117] "RemoveContainer" containerID="f912b42adf2424dd2eef9173682ffea98049dbd253b645455af2dc3509539ee4" Sep 29 10:48:47 crc kubenswrapper[4752]: E0929 10:48:47.119743 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f912b42adf2424dd2eef9173682ffea98049dbd253b645455af2dc3509539ee4\": container with ID starting with f912b42adf2424dd2eef9173682ffea98049dbd253b645455af2dc3509539ee4 not found: ID does not exist" containerID="f912b42adf2424dd2eef9173682ffea98049dbd253b645455af2dc3509539ee4" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.119773 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f912b42adf2424dd2eef9173682ffea98049dbd253b645455af2dc3509539ee4"} err="failed to get container status \"f912b42adf2424dd2eef9173682ffea98049dbd253b645455af2dc3509539ee4\": rpc error: code = NotFound desc = could not find container \"f912b42adf2424dd2eef9173682ffea98049dbd253b645455af2dc3509539ee4\": container with ID starting with f912b42adf2424dd2eef9173682ffea98049dbd253b645455af2dc3509539ee4 not found: ID does not exist" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.119795 4752 scope.go:117] "RemoveContainer" containerID="4aa2a3edcf145318abf8b810b931eccb97bec79f868ba747f02873a8fce23af3" Sep 29 10:48:47 crc kubenswrapper[4752]: E0929 10:48:47.120493 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4aa2a3edcf145318abf8b810b931eccb97bec79f868ba747f02873a8fce23af3\": container with ID starting with 
4aa2a3edcf145318abf8b810b931eccb97bec79f868ba747f02873a8fce23af3 not found: ID does not exist" containerID="4aa2a3edcf145318abf8b810b931eccb97bec79f868ba747f02873a8fce23af3" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.120525 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4aa2a3edcf145318abf8b810b931eccb97bec79f868ba747f02873a8fce23af3"} err="failed to get container status \"4aa2a3edcf145318abf8b810b931eccb97bec79f868ba747f02873a8fce23af3\": rpc error: code = NotFound desc = could not find container \"4aa2a3edcf145318abf8b810b931eccb97bec79f868ba747f02873a8fce23af3\": container with ID starting with 4aa2a3edcf145318abf8b810b931eccb97bec79f868ba747f02873a8fce23af3 not found: ID does not exist" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.120543 4752 scope.go:117] "RemoveContainer" containerID="ff48ef37f008c732e90ca6eadc1d92123e1de0a924df91778543cbe249304097" Sep 29 10:48:47 crc kubenswrapper[4752]: E0929 10:48:47.120924 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff48ef37f008c732e90ca6eadc1d92123e1de0a924df91778543cbe249304097\": container with ID starting with ff48ef37f008c732e90ca6eadc1d92123e1de0a924df91778543cbe249304097 not found: ID does not exist" containerID="ff48ef37f008c732e90ca6eadc1d92123e1de0a924df91778543cbe249304097" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.120973 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff48ef37f008c732e90ca6eadc1d92123e1de0a924df91778543cbe249304097"} err="failed to get container status \"ff48ef37f008c732e90ca6eadc1d92123e1de0a924df91778543cbe249304097\": rpc error: code = NotFound desc = could not find container \"ff48ef37f008c732e90ca6eadc1d92123e1de0a924df91778543cbe249304097\": container with ID starting with ff48ef37f008c732e90ca6eadc1d92123e1de0a924df91778543cbe249304097 not found: ID does not 
exist" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.121009 4752 scope.go:117] "RemoveContainer" containerID="d0c5a7b9b00fc04b5d276284d9ede8949358bd36e57d305311bf53526fb9ae7b" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.137476 4752 scope.go:117] "RemoveContainer" containerID="b1db737947e02f325ac9276d4371efa75931ab79d6707833eb196e806c8be50f" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.155169 4752 scope.go:117] "RemoveContainer" containerID="7dc55d23e87767af43550aa64cf3e4efd9c4bfbc9617b64d5e43b2ab992c95a1" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.171577 4752 scope.go:117] "RemoveContainer" containerID="d0c5a7b9b00fc04b5d276284d9ede8949358bd36e57d305311bf53526fb9ae7b" Sep 29 10:48:47 crc kubenswrapper[4752]: E0929 10:48:47.171966 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0c5a7b9b00fc04b5d276284d9ede8949358bd36e57d305311bf53526fb9ae7b\": container with ID starting with d0c5a7b9b00fc04b5d276284d9ede8949358bd36e57d305311bf53526fb9ae7b not found: ID does not exist" containerID="d0c5a7b9b00fc04b5d276284d9ede8949358bd36e57d305311bf53526fb9ae7b" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.172023 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0c5a7b9b00fc04b5d276284d9ede8949358bd36e57d305311bf53526fb9ae7b"} err="failed to get container status \"d0c5a7b9b00fc04b5d276284d9ede8949358bd36e57d305311bf53526fb9ae7b\": rpc error: code = NotFound desc = could not find container \"d0c5a7b9b00fc04b5d276284d9ede8949358bd36e57d305311bf53526fb9ae7b\": container with ID starting with d0c5a7b9b00fc04b5d276284d9ede8949358bd36e57d305311bf53526fb9ae7b not found: ID does not exist" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.172050 4752 scope.go:117] "RemoveContainer" containerID="b1db737947e02f325ac9276d4371efa75931ab79d6707833eb196e806c8be50f" Sep 29 10:48:47 crc 
kubenswrapper[4752]: E0929 10:48:47.172552 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1db737947e02f325ac9276d4371efa75931ab79d6707833eb196e806c8be50f\": container with ID starting with b1db737947e02f325ac9276d4371efa75931ab79d6707833eb196e806c8be50f not found: ID does not exist" containerID="b1db737947e02f325ac9276d4371efa75931ab79d6707833eb196e806c8be50f" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.172589 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1db737947e02f325ac9276d4371efa75931ab79d6707833eb196e806c8be50f"} err="failed to get container status \"b1db737947e02f325ac9276d4371efa75931ab79d6707833eb196e806c8be50f\": rpc error: code = NotFound desc = could not find container \"b1db737947e02f325ac9276d4371efa75931ab79d6707833eb196e806c8be50f\": container with ID starting with b1db737947e02f325ac9276d4371efa75931ab79d6707833eb196e806c8be50f not found: ID does not exist" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.172605 4752 scope.go:117] "RemoveContainer" containerID="7dc55d23e87767af43550aa64cf3e4efd9c4bfbc9617b64d5e43b2ab992c95a1" Sep 29 10:48:47 crc kubenswrapper[4752]: E0929 10:48:47.172925 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dc55d23e87767af43550aa64cf3e4efd9c4bfbc9617b64d5e43b2ab992c95a1\": container with ID starting with 7dc55d23e87767af43550aa64cf3e4efd9c4bfbc9617b64d5e43b2ab992c95a1 not found: ID does not exist" containerID="7dc55d23e87767af43550aa64cf3e4efd9c4bfbc9617b64d5e43b2ab992c95a1" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.172942 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dc55d23e87767af43550aa64cf3e4efd9c4bfbc9617b64d5e43b2ab992c95a1"} err="failed to get container status 
\"7dc55d23e87767af43550aa64cf3e4efd9c4bfbc9617b64d5e43b2ab992c95a1\": rpc error: code = NotFound desc = could not find container \"7dc55d23e87767af43550aa64cf3e4efd9c4bfbc9617b64d5e43b2ab992c95a1\": container with ID starting with 7dc55d23e87767af43550aa64cf3e4efd9c4bfbc9617b64d5e43b2ab992c95a1 not found: ID does not exist" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.927515 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-64489" event={"ID":"6d06e125-ac1f-4214-8e49-35a46d23413b","Type":"ContainerStarted","Data":"778d5d81b8ee90f59e97e4b4782113e8bd686469c520b0bf831e0c40fd88bd9e"} Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.927568 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-64489" event={"ID":"6d06e125-ac1f-4214-8e49-35a46d23413b","Type":"ContainerStarted","Data":"8d5eae044bf6d4d613989e90cb86a6ae007990092007cc5f2bb84d2f66c1c0b2"} Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.930390 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-64489" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.931647 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-64489" Sep 29 10:48:47 crc kubenswrapper[4752]: I0929 10:48:47.970132 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-64489" podStartSLOduration=1.970111073 podStartE2EDuration="1.970111073s" podCreationTimestamp="2025-09-29 10:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:48:47.951792589 +0000 UTC m=+268.740934256" watchObservedRunningTime="2025-09-29 10:48:47.970111073 +0000 UTC m=+268.759252740" Sep 29 10:48:48 
crc kubenswrapper[4752]: I0929 10:48:48.038772 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06f9e526-21c6-4e20-b1a8-8f4fbfaa6413" path="/var/lib/kubelet/pods/06f9e526-21c6-4e20-b1a8-8f4fbfaa6413/volumes" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.039456 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7beaf483-1002-4e94-a9ee-59e20e83f824" path="/var/lib/kubelet/pods/7beaf483-1002-4e94-a9ee-59e20e83f824/volumes" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.040181 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83d184f7-5dec-4c4c-b53e-d26af311916c" path="/var/lib/kubelet/pods/83d184f7-5dec-4c4c-b53e-d26af311916c/volumes" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.041707 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98952fd9-6515-4ba1-8d1a-490a2c3e33b1" path="/var/lib/kubelet/pods/98952fd9-6515-4ba1-8d1a-490a2c3e33b1/volumes" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.042638 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d04ea3ea-71ac-481c-990b-a989a6f61516" path="/var/lib/kubelet/pods/d04ea3ea-71ac-481c-990b-a989a6f61516/volumes" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.286127 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zwcjc"] Sep 29 10:48:48 crc kubenswrapper[4752]: E0929 10:48:48.286382 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83d184f7-5dec-4c4c-b53e-d26af311916c" containerName="extract-utilities" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.286397 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="83d184f7-5dec-4c4c-b53e-d26af311916c" containerName="extract-utilities" Sep 29 10:48:48 crc kubenswrapper[4752]: E0929 10:48:48.286407 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7beaf483-1002-4e94-a9ee-59e20e83f824" 
containerName="registry-server" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.286413 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7beaf483-1002-4e94-a9ee-59e20e83f824" containerName="registry-server" Sep 29 10:48:48 crc kubenswrapper[4752]: E0929 10:48:48.286425 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7beaf483-1002-4e94-a9ee-59e20e83f824" containerName="extract-content" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.286433 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7beaf483-1002-4e94-a9ee-59e20e83f824" containerName="extract-content" Sep 29 10:48:48 crc kubenswrapper[4752]: E0929 10:48:48.286441 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f9e526-21c6-4e20-b1a8-8f4fbfaa6413" containerName="extract-utilities" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.286447 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f9e526-21c6-4e20-b1a8-8f4fbfaa6413" containerName="extract-utilities" Sep 29 10:48:48 crc kubenswrapper[4752]: E0929 10:48:48.286457 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98952fd9-6515-4ba1-8d1a-490a2c3e33b1" containerName="extract-content" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.286462 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="98952fd9-6515-4ba1-8d1a-490a2c3e33b1" containerName="extract-content" Sep 29 10:48:48 crc kubenswrapper[4752]: E0929 10:48:48.286470 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f9e526-21c6-4e20-b1a8-8f4fbfaa6413" containerName="registry-server" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.286478 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f9e526-21c6-4e20-b1a8-8f4fbfaa6413" containerName="registry-server" Sep 29 10:48:48 crc kubenswrapper[4752]: E0929 10:48:48.286485 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83d184f7-5dec-4c4c-b53e-d26af311916c" 
containerName="extract-content" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.286491 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="83d184f7-5dec-4c4c-b53e-d26af311916c" containerName="extract-content" Sep 29 10:48:48 crc kubenswrapper[4752]: E0929 10:48:48.286503 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98952fd9-6515-4ba1-8d1a-490a2c3e33b1" containerName="registry-server" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.286509 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="98952fd9-6515-4ba1-8d1a-490a2c3e33b1" containerName="registry-server" Sep 29 10:48:48 crc kubenswrapper[4752]: E0929 10:48:48.286522 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04ea3ea-71ac-481c-990b-a989a6f61516" containerName="marketplace-operator" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.286528 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04ea3ea-71ac-481c-990b-a989a6f61516" containerName="marketplace-operator" Sep 29 10:48:48 crc kubenswrapper[4752]: E0929 10:48:48.286534 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7beaf483-1002-4e94-a9ee-59e20e83f824" containerName="extract-utilities" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.286540 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7beaf483-1002-4e94-a9ee-59e20e83f824" containerName="extract-utilities" Sep 29 10:48:48 crc kubenswrapper[4752]: E0929 10:48:48.286549 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83d184f7-5dec-4c4c-b53e-d26af311916c" containerName="registry-server" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.286555 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="83d184f7-5dec-4c4c-b53e-d26af311916c" containerName="registry-server" Sep 29 10:48:48 crc kubenswrapper[4752]: E0929 10:48:48.286564 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f9e526-21c6-4e20-b1a8-8f4fbfaa6413" 
containerName="extract-content" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.286570 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f9e526-21c6-4e20-b1a8-8f4fbfaa6413" containerName="extract-content" Sep 29 10:48:48 crc kubenswrapper[4752]: E0929 10:48:48.286583 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98952fd9-6515-4ba1-8d1a-490a2c3e33b1" containerName="extract-utilities" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.286588 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="98952fd9-6515-4ba1-8d1a-490a2c3e33b1" containerName="extract-utilities" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.286672 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="06f9e526-21c6-4e20-b1a8-8f4fbfaa6413" containerName="registry-server" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.286687 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="83d184f7-5dec-4c4c-b53e-d26af311916c" containerName="registry-server" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.286693 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04ea3ea-71ac-481c-990b-a989a6f61516" containerName="marketplace-operator" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.286703 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="98952fd9-6515-4ba1-8d1a-490a2c3e33b1" containerName="registry-server" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.286713 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7beaf483-1002-4e94-a9ee-59e20e83f824" containerName="registry-server" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.287592 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zwcjc" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.291675 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.301980 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zwcjc"] Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.319566 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10b46062-2635-4c36-9153-0f5b4ae3c054-catalog-content\") pod \"redhat-marketplace-zwcjc\" (UID: \"10b46062-2635-4c36-9153-0f5b4ae3c054\") " pod="openshift-marketplace/redhat-marketplace-zwcjc" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.319668 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xhx9\" (UniqueName: \"kubernetes.io/projected/10b46062-2635-4c36-9153-0f5b4ae3c054-kube-api-access-6xhx9\") pod \"redhat-marketplace-zwcjc\" (UID: \"10b46062-2635-4c36-9153-0f5b4ae3c054\") " pod="openshift-marketplace/redhat-marketplace-zwcjc" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.319703 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b46062-2635-4c36-9153-0f5b4ae3c054-utilities\") pod \"redhat-marketplace-zwcjc\" (UID: \"10b46062-2635-4c36-9153-0f5b4ae3c054\") " pod="openshift-marketplace/redhat-marketplace-zwcjc" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.421044 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10b46062-2635-4c36-9153-0f5b4ae3c054-catalog-content\") pod \"redhat-marketplace-zwcjc\" (UID: 
\"10b46062-2635-4c36-9153-0f5b4ae3c054\") " pod="openshift-marketplace/redhat-marketplace-zwcjc" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.421136 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xhx9\" (UniqueName: \"kubernetes.io/projected/10b46062-2635-4c36-9153-0f5b4ae3c054-kube-api-access-6xhx9\") pod \"redhat-marketplace-zwcjc\" (UID: \"10b46062-2635-4c36-9153-0f5b4ae3c054\") " pod="openshift-marketplace/redhat-marketplace-zwcjc" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.421163 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b46062-2635-4c36-9153-0f5b4ae3c054-utilities\") pod \"redhat-marketplace-zwcjc\" (UID: \"10b46062-2635-4c36-9153-0f5b4ae3c054\") " pod="openshift-marketplace/redhat-marketplace-zwcjc" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.421795 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b46062-2635-4c36-9153-0f5b4ae3c054-utilities\") pod \"redhat-marketplace-zwcjc\" (UID: \"10b46062-2635-4c36-9153-0f5b4ae3c054\") " pod="openshift-marketplace/redhat-marketplace-zwcjc" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.422257 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10b46062-2635-4c36-9153-0f5b4ae3c054-catalog-content\") pod \"redhat-marketplace-zwcjc\" (UID: \"10b46062-2635-4c36-9153-0f5b4ae3c054\") " pod="openshift-marketplace/redhat-marketplace-zwcjc" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.445875 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xhx9\" (UniqueName: \"kubernetes.io/projected/10b46062-2635-4c36-9153-0f5b4ae3c054-kube-api-access-6xhx9\") pod \"redhat-marketplace-zwcjc\" (UID: \"10b46062-2635-4c36-9153-0f5b4ae3c054\") " 
pod="openshift-marketplace/redhat-marketplace-zwcjc" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.486827 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t95zn"] Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.488383 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t95zn" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.491041 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.498405 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t95zn"] Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.522825 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73deb0e5-ae20-4412-b500-571db85cc292-utilities\") pod \"redhat-operators-t95zn\" (UID: \"73deb0e5-ae20-4412-b500-571db85cc292\") " pod="openshift-marketplace/redhat-operators-t95zn" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.522895 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73deb0e5-ae20-4412-b500-571db85cc292-catalog-content\") pod \"redhat-operators-t95zn\" (UID: \"73deb0e5-ae20-4412-b500-571db85cc292\") " pod="openshift-marketplace/redhat-operators-t95zn" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.523201 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lntjf\" (UniqueName: \"kubernetes.io/projected/73deb0e5-ae20-4412-b500-571db85cc292-kube-api-access-lntjf\") pod \"redhat-operators-t95zn\" (UID: \"73deb0e5-ae20-4412-b500-571db85cc292\") " pod="openshift-marketplace/redhat-operators-t95zn" 
Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.604910 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zwcjc" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.624884 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lntjf\" (UniqueName: \"kubernetes.io/projected/73deb0e5-ae20-4412-b500-571db85cc292-kube-api-access-lntjf\") pod \"redhat-operators-t95zn\" (UID: \"73deb0e5-ae20-4412-b500-571db85cc292\") " pod="openshift-marketplace/redhat-operators-t95zn" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.624956 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73deb0e5-ae20-4412-b500-571db85cc292-utilities\") pod \"redhat-operators-t95zn\" (UID: \"73deb0e5-ae20-4412-b500-571db85cc292\") " pod="openshift-marketplace/redhat-operators-t95zn" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.624988 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73deb0e5-ae20-4412-b500-571db85cc292-catalog-content\") pod \"redhat-operators-t95zn\" (UID: \"73deb0e5-ae20-4412-b500-571db85cc292\") " pod="openshift-marketplace/redhat-operators-t95zn" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.625450 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73deb0e5-ae20-4412-b500-571db85cc292-catalog-content\") pod \"redhat-operators-t95zn\" (UID: \"73deb0e5-ae20-4412-b500-571db85cc292\") " pod="openshift-marketplace/redhat-operators-t95zn" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.625550 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73deb0e5-ae20-4412-b500-571db85cc292-utilities\") pod 
\"redhat-operators-t95zn\" (UID: \"73deb0e5-ae20-4412-b500-571db85cc292\") " pod="openshift-marketplace/redhat-operators-t95zn" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.646042 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lntjf\" (UniqueName: \"kubernetes.io/projected/73deb0e5-ae20-4412-b500-571db85cc292-kube-api-access-lntjf\") pod \"redhat-operators-t95zn\" (UID: \"73deb0e5-ae20-4412-b500-571db85cc292\") " pod="openshift-marketplace/redhat-operators-t95zn" Sep 29 10:48:48 crc kubenswrapper[4752]: I0929 10:48:48.822208 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t95zn" Sep 29 10:48:49 crc kubenswrapper[4752]: I0929 10:48:49.015584 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zwcjc"] Sep 29 10:48:49 crc kubenswrapper[4752]: W0929 10:48:49.021940 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10b46062_2635_4c36_9153_0f5b4ae3c054.slice/crio-ac50ca3be5cc0413c7955637727c1991aa5783fb95386a2688233a005bdacc9f WatchSource:0}: Error finding container ac50ca3be5cc0413c7955637727c1991aa5783fb95386a2688233a005bdacc9f: Status 404 returned error can't find the container with id ac50ca3be5cc0413c7955637727c1991aa5783fb95386a2688233a005bdacc9f Sep 29 10:48:49 crc kubenswrapper[4752]: I0929 10:48:49.236384 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t95zn"] Sep 29 10:48:49 crc kubenswrapper[4752]: W0929 10:48:49.275402 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73deb0e5_ae20_4412_b500_571db85cc292.slice/crio-08b88607ff7b915bfd7ab7b65097b91893b965fbad64d3ecf4962873365b76e6 WatchSource:0}: Error finding container 
08b88607ff7b915bfd7ab7b65097b91893b965fbad64d3ecf4962873365b76e6: Status 404 returned error can't find the container with id 08b88607ff7b915bfd7ab7b65097b91893b965fbad64d3ecf4962873365b76e6 Sep 29 10:48:49 crc kubenswrapper[4752]: I0929 10:48:49.950037 4752 generic.go:334] "Generic (PLEG): container finished" podID="73deb0e5-ae20-4412-b500-571db85cc292" containerID="751f408aa56a540d619d84417039fefd6f66961578040e4ad9d41d1d1c0d967f" exitCode=0 Sep 29 10:48:49 crc kubenswrapper[4752]: I0929 10:48:49.950149 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t95zn" event={"ID":"73deb0e5-ae20-4412-b500-571db85cc292","Type":"ContainerDied","Data":"751f408aa56a540d619d84417039fefd6f66961578040e4ad9d41d1d1c0d967f"} Sep 29 10:48:49 crc kubenswrapper[4752]: I0929 10:48:49.950195 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t95zn" event={"ID":"73deb0e5-ae20-4412-b500-571db85cc292","Type":"ContainerStarted","Data":"08b88607ff7b915bfd7ab7b65097b91893b965fbad64d3ecf4962873365b76e6"} Sep 29 10:48:49 crc kubenswrapper[4752]: I0929 10:48:49.951957 4752 generic.go:334] "Generic (PLEG): container finished" podID="10b46062-2635-4c36-9153-0f5b4ae3c054" containerID="8a89f64ec8fc7261f1c2dc3cf485b7188cb9bc0799a0960ccb301b2973cd1882" exitCode=0 Sep 29 10:48:49 crc kubenswrapper[4752]: I0929 10:48:49.952408 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwcjc" event={"ID":"10b46062-2635-4c36-9153-0f5b4ae3c054","Type":"ContainerDied","Data":"8a89f64ec8fc7261f1c2dc3cf485b7188cb9bc0799a0960ccb301b2973cd1882"} Sep 29 10:48:49 crc kubenswrapper[4752]: I0929 10:48:49.952487 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwcjc" event={"ID":"10b46062-2635-4c36-9153-0f5b4ae3c054","Type":"ContainerStarted","Data":"ac50ca3be5cc0413c7955637727c1991aa5783fb95386a2688233a005bdacc9f"} Sep 29 10:48:50 
crc kubenswrapper[4752]: I0929 10:48:50.686198 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-brllf"] Sep 29 10:48:50 crc kubenswrapper[4752]: I0929 10:48:50.687966 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-brllf" Sep 29 10:48:50 crc kubenswrapper[4752]: I0929 10:48:50.690938 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 29 10:48:50 crc kubenswrapper[4752]: I0929 10:48:50.712571 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-brllf"] Sep 29 10:48:50 crc kubenswrapper[4752]: I0929 10:48:50.756008 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b31d27ce-a227-4b44-ac25-10e7c9133fb8-utilities\") pod \"certified-operators-brllf\" (UID: \"b31d27ce-a227-4b44-ac25-10e7c9133fb8\") " pod="openshift-marketplace/certified-operators-brllf" Sep 29 10:48:50 crc kubenswrapper[4752]: I0929 10:48:50.756165 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b31d27ce-a227-4b44-ac25-10e7c9133fb8-catalog-content\") pod \"certified-operators-brllf\" (UID: \"b31d27ce-a227-4b44-ac25-10e7c9133fb8\") " pod="openshift-marketplace/certified-operators-brllf" Sep 29 10:48:50 crc kubenswrapper[4752]: I0929 10:48:50.756207 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w49dt\" (UniqueName: \"kubernetes.io/projected/b31d27ce-a227-4b44-ac25-10e7c9133fb8-kube-api-access-w49dt\") pod \"certified-operators-brllf\" (UID: \"b31d27ce-a227-4b44-ac25-10e7c9133fb8\") " pod="openshift-marketplace/certified-operators-brllf" Sep 29 10:48:50 crc 
kubenswrapper[4752]: I0929 10:48:50.857235 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b31d27ce-a227-4b44-ac25-10e7c9133fb8-catalog-content\") pod \"certified-operators-brllf\" (UID: \"b31d27ce-a227-4b44-ac25-10e7c9133fb8\") " pod="openshift-marketplace/certified-operators-brllf"
Sep 29 10:48:50 crc kubenswrapper[4752]: I0929 10:48:50.857574 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w49dt\" (UniqueName: \"kubernetes.io/projected/b31d27ce-a227-4b44-ac25-10e7c9133fb8-kube-api-access-w49dt\") pod \"certified-operators-brllf\" (UID: \"b31d27ce-a227-4b44-ac25-10e7c9133fb8\") " pod="openshift-marketplace/certified-operators-brllf"
Sep 29 10:48:50 crc kubenswrapper[4752]: I0929 10:48:50.857687 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b31d27ce-a227-4b44-ac25-10e7c9133fb8-utilities\") pod \"certified-operators-brllf\" (UID: \"b31d27ce-a227-4b44-ac25-10e7c9133fb8\") " pod="openshift-marketplace/certified-operators-brllf"
Sep 29 10:48:50 crc kubenswrapper[4752]: I0929 10:48:50.857969 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b31d27ce-a227-4b44-ac25-10e7c9133fb8-catalog-content\") pod \"certified-operators-brllf\" (UID: \"b31d27ce-a227-4b44-ac25-10e7c9133fb8\") " pod="openshift-marketplace/certified-operators-brllf"
Sep 29 10:48:50 crc kubenswrapper[4752]: I0929 10:48:50.858031 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b31d27ce-a227-4b44-ac25-10e7c9133fb8-utilities\") pod \"certified-operators-brllf\" (UID: \"b31d27ce-a227-4b44-ac25-10e7c9133fb8\") " pod="openshift-marketplace/certified-operators-brllf"
Sep 29 10:48:50 crc kubenswrapper[4752]: I0929 10:48:50.890621 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bpmrk"]
Sep 29 10:48:50 crc kubenswrapper[4752]: I0929 10:48:50.891865 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bpmrk"
Sep 29 10:48:50 crc kubenswrapper[4752]: I0929 10:48:50.896230 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w49dt\" (UniqueName: \"kubernetes.io/projected/b31d27ce-a227-4b44-ac25-10e7c9133fb8-kube-api-access-w49dt\") pod \"certified-operators-brllf\" (UID: \"b31d27ce-a227-4b44-ac25-10e7c9133fb8\") " pod="openshift-marketplace/certified-operators-brllf"
Sep 29 10:48:50 crc kubenswrapper[4752]: I0929 10:48:50.897355 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Sep 29 10:48:50 crc kubenswrapper[4752]: I0929 10:48:50.903075 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bpmrk"]
Sep 29 10:48:50 crc kubenswrapper[4752]: I0929 10:48:50.958949 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ks6x\" (UniqueName: \"kubernetes.io/projected/201657e8-ebe9-4415-acd9-9971ede44bd2-kube-api-access-5ks6x\") pod \"community-operators-bpmrk\" (UID: \"201657e8-ebe9-4415-acd9-9971ede44bd2\") " pod="openshift-marketplace/community-operators-bpmrk"
Sep 29 10:48:50 crc kubenswrapper[4752]: I0929 10:48:50.959349 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201657e8-ebe9-4415-acd9-9971ede44bd2-catalog-content\") pod \"community-operators-bpmrk\" (UID: \"201657e8-ebe9-4415-acd9-9971ede44bd2\") " pod="openshift-marketplace/community-operators-bpmrk"
Sep 29 10:48:50 crc kubenswrapper[4752]: I0929 10:48:50.959402 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201657e8-ebe9-4415-acd9-9971ede44bd2-utilities\") pod \"community-operators-bpmrk\" (UID: \"201657e8-ebe9-4415-acd9-9971ede44bd2\") " pod="openshift-marketplace/community-operators-bpmrk"
Sep 29 10:48:51 crc kubenswrapper[4752]: I0929 10:48:51.008690 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-brllf"
Sep 29 10:48:51 crc kubenswrapper[4752]: I0929 10:48:51.060918 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201657e8-ebe9-4415-acd9-9971ede44bd2-utilities\") pod \"community-operators-bpmrk\" (UID: \"201657e8-ebe9-4415-acd9-9971ede44bd2\") " pod="openshift-marketplace/community-operators-bpmrk"
Sep 29 10:48:51 crc kubenswrapper[4752]: I0929 10:48:51.061288 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ks6x\" (UniqueName: \"kubernetes.io/projected/201657e8-ebe9-4415-acd9-9971ede44bd2-kube-api-access-5ks6x\") pod \"community-operators-bpmrk\" (UID: \"201657e8-ebe9-4415-acd9-9971ede44bd2\") " pod="openshift-marketplace/community-operators-bpmrk"
Sep 29 10:48:51 crc kubenswrapper[4752]: I0929 10:48:51.061344 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201657e8-ebe9-4415-acd9-9971ede44bd2-catalog-content\") pod \"community-operators-bpmrk\" (UID: \"201657e8-ebe9-4415-acd9-9971ede44bd2\") " pod="openshift-marketplace/community-operators-bpmrk"
Sep 29 10:48:51 crc kubenswrapper[4752]: I0929 10:48:51.061771 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201657e8-ebe9-4415-acd9-9971ede44bd2-utilities\") pod \"community-operators-bpmrk\" (UID: \"201657e8-ebe9-4415-acd9-9971ede44bd2\") " pod="openshift-marketplace/community-operators-bpmrk"
Sep 29 10:48:51 crc kubenswrapper[4752]: I0929 10:48:51.061894 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201657e8-ebe9-4415-acd9-9971ede44bd2-catalog-content\") pod \"community-operators-bpmrk\" (UID: \"201657e8-ebe9-4415-acd9-9971ede44bd2\") " pod="openshift-marketplace/community-operators-bpmrk"
Sep 29 10:48:51 crc kubenswrapper[4752]: I0929 10:48:51.082558 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ks6x\" (UniqueName: \"kubernetes.io/projected/201657e8-ebe9-4415-acd9-9971ede44bd2-kube-api-access-5ks6x\") pod \"community-operators-bpmrk\" (UID: \"201657e8-ebe9-4415-acd9-9971ede44bd2\") " pod="openshift-marketplace/community-operators-bpmrk"
Sep 29 10:48:51 crc kubenswrapper[4752]: I0929 10:48:51.237832 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-brllf"]
Sep 29 10:48:51 crc kubenswrapper[4752]: I0929 10:48:51.246534 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bpmrk"
Sep 29 10:48:51 crc kubenswrapper[4752]: I0929 10:48:51.938259 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bpmrk"]
Sep 29 10:48:51 crc kubenswrapper[4752]: I0929 10:48:51.964634 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t95zn" event={"ID":"73deb0e5-ae20-4412-b500-571db85cc292","Type":"ContainerStarted","Data":"2ae1e204533a8c3917baf71c728617b25db8d20622ed0bb1d704957a792e07b5"}
Sep 29 10:48:51 crc kubenswrapper[4752]: I0929 10:48:51.967757 4752 generic.go:334] "Generic (PLEG): container finished" podID="10b46062-2635-4c36-9153-0f5b4ae3c054" containerID="6ceefdd8abcd8bec6df0672c2416d55d9b33232b88c1be527248e88c6117c5c5" exitCode=0
Sep 29 10:48:51 crc kubenswrapper[4752]: I0929 10:48:51.967863 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwcjc" event={"ID":"10b46062-2635-4c36-9153-0f5b4ae3c054","Type":"ContainerDied","Data":"6ceefdd8abcd8bec6df0672c2416d55d9b33232b88c1be527248e88c6117c5c5"}
Sep 29 10:48:51 crc kubenswrapper[4752]: I0929 10:48:51.973281 4752 generic.go:334] "Generic (PLEG): container finished" podID="b31d27ce-a227-4b44-ac25-10e7c9133fb8" containerID="27717a49532eeddbf7f0cef6546f5c1cf8918142ea71e9d812e1cdee3b017b7d" exitCode=0
Sep 29 10:48:51 crc kubenswrapper[4752]: I0929 10:48:51.973357 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-brllf" event={"ID":"b31d27ce-a227-4b44-ac25-10e7c9133fb8","Type":"ContainerDied","Data":"27717a49532eeddbf7f0cef6546f5c1cf8918142ea71e9d812e1cdee3b017b7d"}
Sep 29 10:48:51 crc kubenswrapper[4752]: I0929 10:48:51.973389 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-brllf" event={"ID":"b31d27ce-a227-4b44-ac25-10e7c9133fb8","Type":"ContainerStarted","Data":"ce50b6cfafaae1a812631057966a57f8260b5b4a5a3904d314a9205e0af7cfba"}
Sep 29 10:48:51 crc kubenswrapper[4752]: I0929 10:48:51.974823 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpmrk" event={"ID":"201657e8-ebe9-4415-acd9-9971ede44bd2","Type":"ContainerStarted","Data":"d4b0d43b276116e22ddf62523f15f9195e736e4f1a0eff6bd30e0184e75cd309"}
Sep 29 10:48:52 crc kubenswrapper[4752]: I0929 10:48:52.985917 4752 generic.go:334] "Generic (PLEG): container finished" podID="73deb0e5-ae20-4412-b500-571db85cc292" containerID="2ae1e204533a8c3917baf71c728617b25db8d20622ed0bb1d704957a792e07b5" exitCode=0
Sep 29 10:48:52 crc kubenswrapper[4752]: I0929 10:48:52.986168 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t95zn" event={"ID":"73deb0e5-ae20-4412-b500-571db85cc292","Type":"ContainerDied","Data":"2ae1e204533a8c3917baf71c728617b25db8d20622ed0bb1d704957a792e07b5"}
Sep 29 10:48:52 crc kubenswrapper[4752]: I0929 10:48:52.989239 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwcjc" event={"ID":"10b46062-2635-4c36-9153-0f5b4ae3c054","Type":"ContainerStarted","Data":"99e024d5ad73d3658bfbc77d247b8839c324325f49ef024a3bfff7a0b6e02d36"}
Sep 29 10:48:52 crc kubenswrapper[4752]: I0929 10:48:52.992006 4752 generic.go:334] "Generic (PLEG): container finished" podID="201657e8-ebe9-4415-acd9-9971ede44bd2" containerID="7ad1a28f5e8ebea157fd893ce644e2796d6a2e9c3480f2ad7ade6447c75dc7e1" exitCode=0
Sep 29 10:48:52 crc kubenswrapper[4752]: I0929 10:48:52.992070 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpmrk" event={"ID":"201657e8-ebe9-4415-acd9-9971ede44bd2","Type":"ContainerDied","Data":"7ad1a28f5e8ebea157fd893ce644e2796d6a2e9c3480f2ad7ade6447c75dc7e1"}
Sep 29 10:48:53 crc kubenswrapper[4752]: I0929 10:48:53.025369 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zwcjc" podStartSLOduration=2.242086548 podStartE2EDuration="5.025342305s" podCreationTimestamp="2025-09-29 10:48:48 +0000 UTC" firstStartedPulling="2025-09-29 10:48:49.954100357 +0000 UTC m=+270.743242024" lastFinishedPulling="2025-09-29 10:48:52.737356114 +0000 UTC m=+273.526497781" observedRunningTime="2025-09-29 10:48:53.025284093 +0000 UTC m=+273.814425781" watchObservedRunningTime="2025-09-29 10:48:53.025342305 +0000 UTC m=+273.814483972"
Sep 29 10:48:55 crc kubenswrapper[4752]: I0929 10:48:55.008878 4752 generic.go:334] "Generic (PLEG): container finished" podID="b31d27ce-a227-4b44-ac25-10e7c9133fb8" containerID="7af981f530af63bfc306075d760bf83811931dedcf082fc31803afd4918571f5" exitCode=0
Sep 29 10:48:55 crc kubenswrapper[4752]: I0929 10:48:55.008981 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-brllf" event={"ID":"b31d27ce-a227-4b44-ac25-10e7c9133fb8","Type":"ContainerDied","Data":"7af981f530af63bfc306075d760bf83811931dedcf082fc31803afd4918571f5"}
Sep 29 10:48:55 crc kubenswrapper[4752]: I0929 10:48:55.016848 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t95zn" event={"ID":"73deb0e5-ae20-4412-b500-571db85cc292","Type":"ContainerStarted","Data":"053570d9ffeec7129249cb9c99bef660d0ca55a7bc728bde0d6b106dbfa59a89"}
Sep 29 10:48:55 crc kubenswrapper[4752]: I0929 10:48:55.051980 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t95zn" podStartSLOduration=2.974175024 podStartE2EDuration="7.051962777s" podCreationTimestamp="2025-09-29 10:48:48 +0000 UTC" firstStartedPulling="2025-09-29 10:48:49.951736816 +0000 UTC m=+270.740878483" lastFinishedPulling="2025-09-29 10:48:54.029524569 +0000 UTC m=+274.818666236" observedRunningTime="2025-09-29 10:48:55.048431034 +0000 UTC m=+275.837572721" watchObservedRunningTime="2025-09-29 10:48:55.051962777 +0000 UTC m=+275.841104444"
Sep 29 10:48:56 crc kubenswrapper[4752]: I0929 10:48:56.026248 4752 generic.go:334] "Generic (PLEG): container finished" podID="201657e8-ebe9-4415-acd9-9971ede44bd2" containerID="2977dc9f34141c37593a06c1355aca688e3ab7cdfab516f28e43f8aea4dbbf1e" exitCode=0
Sep 29 10:48:56 crc kubenswrapper[4752]: I0929 10:48:56.026360 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpmrk" event={"ID":"201657e8-ebe9-4415-acd9-9971ede44bd2","Type":"ContainerDied","Data":"2977dc9f34141c37593a06c1355aca688e3ab7cdfab516f28e43f8aea4dbbf1e"}
Sep 29 10:48:56 crc kubenswrapper[4752]: I0929 10:48:56.041197 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-brllf" event={"ID":"b31d27ce-a227-4b44-ac25-10e7c9133fb8","Type":"ContainerStarted","Data":"cfab89d425d73f9838d20b0e663a5437d270265abb2a4bc5e0dd554ef0b9d4d5"}
Sep 29 10:48:56 crc kubenswrapper[4752]: I0929 10:48:56.075425 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-brllf" podStartSLOduration=2.34235986 podStartE2EDuration="6.075392511s" podCreationTimestamp="2025-09-29 10:48:50 +0000 UTC" firstStartedPulling="2025-09-29 10:48:51.975362608 +0000 UTC m=+272.764504275" lastFinishedPulling="2025-09-29 10:48:55.708395259 +0000 UTC m=+276.497536926" observedRunningTime="2025-09-29 10:48:56.073617924 +0000 UTC m=+276.862759601" watchObservedRunningTime="2025-09-29 10:48:56.075392511 +0000 UTC m=+276.864534168"
Sep 29 10:48:57 crc kubenswrapper[4752]: I0929 10:48:57.041733 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpmrk" event={"ID":"201657e8-ebe9-4415-acd9-9971ede44bd2","Type":"ContainerStarted","Data":"397193adea99c795e0837cfd11c529738d48410a7cb572b17b02a58ae6456cba"}
Sep 29 10:48:58 crc kubenswrapper[4752]: I0929 10:48:58.605264 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zwcjc"
Sep 29 10:48:58 crc kubenswrapper[4752]: I0929 10:48:58.605625 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zwcjc"
Sep 29 10:48:58 crc kubenswrapper[4752]: I0929 10:48:58.660999 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zwcjc"
Sep 29 10:48:58 crc kubenswrapper[4752]: I0929 10:48:58.687764 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bpmrk" podStartSLOduration=5.37967138 podStartE2EDuration="8.687727964s" podCreationTimestamp="2025-09-29 10:48:50 +0000 UTC" firstStartedPulling="2025-09-29 10:48:53.115775598 +0000 UTC m=+273.904917265" lastFinishedPulling="2025-09-29 10:48:56.423832182 +0000 UTC m=+277.212973849" observedRunningTime="2025-09-29 10:48:57.061039295 +0000 UTC m=+277.850180962" watchObservedRunningTime="2025-09-29 10:48:58.687727964 +0000 UTC m=+279.476869631"
Sep 29 10:48:58 crc kubenswrapper[4752]: I0929 10:48:58.823829 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t95zn"
Sep 29 10:48:58 crc kubenswrapper[4752]: I0929 10:48:58.823891 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t95zn"
Sep 29 10:48:58 crc kubenswrapper[4752]: I0929 10:48:58.865076 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t95zn"
Sep 29 10:48:59 crc kubenswrapper[4752]: I0929 10:48:59.093744 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t95zn"
Sep 29 10:48:59 crc kubenswrapper[4752]: I0929 10:48:59.095804 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zwcjc"
Sep 29 10:49:01 crc kubenswrapper[4752]: I0929 10:49:01.008957 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-brllf"
Sep 29 10:49:01 crc kubenswrapper[4752]: I0929 10:49:01.009923 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-brllf"
Sep 29 10:49:01 crc kubenswrapper[4752]: I0929 10:49:01.050249 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-brllf"
Sep 29 10:49:01 crc kubenswrapper[4752]: I0929 10:49:01.104262 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-brllf"
Sep 29 10:49:01 crc kubenswrapper[4752]: I0929 10:49:01.247806 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bpmrk"
Sep 29 10:49:01 crc kubenswrapper[4752]: I0929 10:49:01.247873 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bpmrk"
Sep 29 10:49:01 crc kubenswrapper[4752]: I0929 10:49:01.285700 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bpmrk"
Sep 29 10:49:02 crc kubenswrapper[4752]: I0929 10:49:02.108156 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bpmrk"
Sep 29 10:49:56 crc kubenswrapper[4752]: I0929 10:49:56.176166 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 10:49:56 crc kubenswrapper[4752]: I0929 10:49:56.177296 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 10:50:26 crc kubenswrapper[4752]: I0929 10:50:26.175681 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 10:50:26 crc kubenswrapper[4752]: I0929 10:50:26.176620 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 10:50:56 crc kubenswrapper[4752]: I0929 10:50:56.175751 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 10:50:56 crc kubenswrapper[4752]: I0929 10:50:56.176644 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 10:50:56 crc kubenswrapper[4752]: I0929 10:50:56.176709 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs"
Sep 29 10:50:56 crc kubenswrapper[4752]: I0929 10:50:56.177542 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"907813a6b730b09c945dfe34cd11dc9926afdeb1d7a721e02b4bac45108adba9"} pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 29 10:50:56 crc kubenswrapper[4752]: I0929 10:50:56.177612 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" containerID="cri-o://907813a6b730b09c945dfe34cd11dc9926afdeb1d7a721e02b4bac45108adba9" gracePeriod=600
Sep 29 10:50:56 crc kubenswrapper[4752]: I0929 10:50:56.741276 4752 generic.go:334] "Generic (PLEG): container finished" podID="5863c243-797d-462a-b11f-71aaf005f8d1" containerID="907813a6b730b09c945dfe34cd11dc9926afdeb1d7a721e02b4bac45108adba9" exitCode=0
Sep 29 10:50:56 crc kubenswrapper[4752]: I0929 10:50:56.741380 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerDied","Data":"907813a6b730b09c945dfe34cd11dc9926afdeb1d7a721e02b4bac45108adba9"}
Sep 29 10:50:56 crc kubenswrapper[4752]: I0929 10:50:56.741850 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerStarted","Data":"3f331ecf545ae76e5229e9ac291d3a8fabb44711af838903a027a58783c88d03"}
Sep 29 10:50:56 crc kubenswrapper[4752]: I0929 10:50:56.741898 4752 scope.go:117] "RemoveContainer" containerID="32155f6078e9c15abe4c659ac79b064ec182a232ea1d816998da4de273b7aa67"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.148819 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xxqqt"]
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.150671 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.165021 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xxqqt"]
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.287494 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e3f0c747-6140-45e8-85a6-eb551b33adef-bound-sa-token\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.287546 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e3f0c747-6140-45e8-85a6-eb551b33adef-registry-tls\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.287575 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.287701 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e3f0c747-6140-45e8-85a6-eb551b33adef-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.287868 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e3f0c747-6140-45e8-85a6-eb551b33adef-trusted-ca\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.287921 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w8fv\" (UniqueName: \"kubernetes.io/projected/e3f0c747-6140-45e8-85a6-eb551b33adef-kube-api-access-4w8fv\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.287965 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e3f0c747-6140-45e8-85a6-eb551b33adef-registry-certificates\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.288008 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e3f0c747-6140-45e8-85a6-eb551b33adef-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.315329 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.389363 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e3f0c747-6140-45e8-85a6-eb551b33adef-trusted-ca\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.389428 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w8fv\" (UniqueName: \"kubernetes.io/projected/e3f0c747-6140-45e8-85a6-eb551b33adef-kube-api-access-4w8fv\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.389455 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e3f0c747-6140-45e8-85a6-eb551b33adef-registry-certificates\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.389479 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e3f0c747-6140-45e8-85a6-eb551b33adef-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.389516 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e3f0c747-6140-45e8-85a6-eb551b33adef-bound-sa-token\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.389533 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e3f0c747-6140-45e8-85a6-eb551b33adef-registry-tls\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.389559 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e3f0c747-6140-45e8-85a6-eb551b33adef-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.390746 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e3f0c747-6140-45e8-85a6-eb551b33adef-trusted-ca\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.390933 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e3f0c747-6140-45e8-85a6-eb551b33adef-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.390981 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e3f0c747-6140-45e8-85a6-eb551b33adef-registry-certificates\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.396692 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e3f0c747-6140-45e8-85a6-eb551b33adef-registry-tls\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.397527 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e3f0c747-6140-45e8-85a6-eb551b33adef-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.405892 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w8fv\" (UniqueName: \"kubernetes.io/projected/e3f0c747-6140-45e8-85a6-eb551b33adef-kube-api-access-4w8fv\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.406663 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e3f0c747-6140-45e8-85a6-eb551b33adef-bound-sa-token\") pod \"image-registry-66df7c8f76-xxqqt\" (UID: \"e3f0c747-6140-45e8-85a6-eb551b33adef\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.471510 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.649933 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xxqqt"]
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.901245 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt" event={"ID":"e3f0c747-6140-45e8-85a6-eb551b33adef","Type":"ContainerStarted","Data":"b801f80791979f6f9677d2793ecf74fb0232073b015fb330c68ba4b6833be7fc"}
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.901294 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt" event={"ID":"e3f0c747-6140-45e8-85a6-eb551b33adef","Type":"ContainerStarted","Data":"cef3d6d692d4e9dedafc5afbc1705f4e7954b75b75f7fc37729fe1a7618e1aa3"}
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.901396 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:23 crc kubenswrapper[4752]: I0929 10:51:23.924527 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt" podStartSLOduration=0.924507867 podStartE2EDuration="924.507867ms" podCreationTimestamp="2025-09-29 10:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:51:23.920706936 +0000 UTC m=+424.709848623" watchObservedRunningTime="2025-09-29 10:51:23.924507867 +0000 UTC m=+424.713649534"
Sep 29 10:51:43 crc kubenswrapper[4752]: I0929 10:51:43.476497 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-xxqqt"
Sep 29 10:51:43 crc kubenswrapper[4752]: I0929 10:51:43.528898 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-57fqh"]
Sep 29 10:52:08 crc kubenswrapper[4752]: I0929 10:52:08.566055 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" podUID="8f756d24-5e77-4130-b920-794234a82ece" containerName="registry" containerID="cri-o://8cf7746dad67cc0560347e51f223b56cf7fdf06f99e050e072696b98b4c557e1" gracePeriod=30
Sep 29 10:52:08 crc kubenswrapper[4752]: I0929 10:52:08.926515 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-57fqh"
Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.070187 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f756d24-5e77-4130-b920-794234a82ece-registry-tls\") pod \"8f756d24-5e77-4130-b920-794234a82ece\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") "
Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.070337 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f756d24-5e77-4130-b920-794234a82ece-trusted-ca\") pod \"8f756d24-5e77-4130-b920-794234a82ece\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") "
Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.070399 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f756d24-5e77-4130-b920-794234a82ece-ca-trust-extracted\") pod \"8f756d24-5e77-4130-b920-794234a82ece\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") "
Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.070438 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f756d24-5e77-4130-b920-794234a82ece-registry-certificates\") pod \"8f756d24-5e77-4130-b920-794234a82ece\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") "
Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.070840 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f756d24-5e77-4130-b920-794234a82ece\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") "
Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.070896 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdmbs\" (UniqueName: \"kubernetes.io/projected/8f756d24-5e77-4130-b920-794234a82ece-kube-api-access-gdmbs\") pod \"8f756d24-5e77-4130-b920-794234a82ece\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") "
Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.070951 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f756d24-5e77-4130-b920-794234a82ece-bound-sa-token\") pod \"8f756d24-5e77-4130-b920-794234a82ece\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") "
Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.070995 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f756d24-5e77-4130-b920-794234a82ece-installation-pull-secrets\") pod \"8f756d24-5e77-4130-b920-794234a82ece\" (UID: \"8f756d24-5e77-4130-b920-794234a82ece\") "
Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.072327 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f756d24-5e77-4130-b920-794234a82ece-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f756d24-5e77-4130-b920-794234a82ece" (UID: "8f756d24-5e77-4130-b920-794234a82ece"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.072433 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f756d24-5e77-4130-b920-794234a82ece-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f756d24-5e77-4130-b920-794234a82ece" (UID: "8f756d24-5e77-4130-b920-794234a82ece"). InnerVolumeSpecName "registry-certificates".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.079893 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f756d24-5e77-4130-b920-794234a82ece-kube-api-access-gdmbs" (OuterVolumeSpecName: "kube-api-access-gdmbs") pod "8f756d24-5e77-4130-b920-794234a82ece" (UID: "8f756d24-5e77-4130-b920-794234a82ece"). InnerVolumeSpecName "kube-api-access-gdmbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.080316 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f756d24-5e77-4130-b920-794234a82ece-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f756d24-5e77-4130-b920-794234a82ece" (UID: "8f756d24-5e77-4130-b920-794234a82ece"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.080977 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f756d24-5e77-4130-b920-794234a82ece-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f756d24-5e77-4130-b920-794234a82ece" (UID: "8f756d24-5e77-4130-b920-794234a82ece"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.082827 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f756d24-5e77-4130-b920-794234a82ece-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f756d24-5e77-4130-b920-794234a82ece" (UID: "8f756d24-5e77-4130-b920-794234a82ece"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.085354 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "8f756d24-5e77-4130-b920-794234a82ece" (UID: "8f756d24-5e77-4130-b920-794234a82ece"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.089859 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f756d24-5e77-4130-b920-794234a82ece-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f756d24-5e77-4130-b920-794234a82ece" (UID: "8f756d24-5e77-4130-b920-794234a82ece"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.160852 4752 generic.go:334] "Generic (PLEG): container finished" podID="8f756d24-5e77-4130-b920-794234a82ece" containerID="8cf7746dad67cc0560347e51f223b56cf7fdf06f99e050e072696b98b4c557e1" exitCode=0 Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.160928 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" event={"ID":"8f756d24-5e77-4130-b920-794234a82ece","Type":"ContainerDied","Data":"8cf7746dad67cc0560347e51f223b56cf7fdf06f99e050e072696b98b4c557e1"} Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.160988 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.161019 4752 scope.go:117] "RemoveContainer" containerID="8cf7746dad67cc0560347e51f223b56cf7fdf06f99e050e072696b98b4c557e1" Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.160995 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-57fqh" event={"ID":"8f756d24-5e77-4130-b920-794234a82ece","Type":"ContainerDied","Data":"0feb392e49335a28f5a46448609b0aade3feabfa88f0c477493b2de19f207244"} Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.172549 4752 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f756d24-5e77-4130-b920-794234a82ece-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.172931 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f756d24-5e77-4130-b920-794234a82ece-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.172942 4752 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f756d24-5e77-4130-b920-794234a82ece-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.172956 4752 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f756d24-5e77-4130-b920-794234a82ece-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.172968 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdmbs\" (UniqueName: \"kubernetes.io/projected/8f756d24-5e77-4130-b920-794234a82ece-kube-api-access-gdmbs\") on node \"crc\" DevicePath \"\"" Sep 29 10:52:09 crc 
kubenswrapper[4752]: I0929 10:52:09.172978 4752 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f756d24-5e77-4130-b920-794234a82ece-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.172988 4752 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f756d24-5e77-4130-b920-794234a82ece-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.178446 4752 scope.go:117] "RemoveContainer" containerID="8cf7746dad67cc0560347e51f223b56cf7fdf06f99e050e072696b98b4c557e1" Sep 29 10:52:09 crc kubenswrapper[4752]: E0929 10:52:09.179106 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cf7746dad67cc0560347e51f223b56cf7fdf06f99e050e072696b98b4c557e1\": container with ID starting with 8cf7746dad67cc0560347e51f223b56cf7fdf06f99e050e072696b98b4c557e1 not found: ID does not exist" containerID="8cf7746dad67cc0560347e51f223b56cf7fdf06f99e050e072696b98b4c557e1" Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.179142 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf7746dad67cc0560347e51f223b56cf7fdf06f99e050e072696b98b4c557e1"} err="failed to get container status \"8cf7746dad67cc0560347e51f223b56cf7fdf06f99e050e072696b98b4c557e1\": rpc error: code = NotFound desc = could not find container \"8cf7746dad67cc0560347e51f223b56cf7fdf06f99e050e072696b98b4c557e1\": container with ID starting with 8cf7746dad67cc0560347e51f223b56cf7fdf06f99e050e072696b98b4c557e1 not found: ID does not exist" Sep 29 10:52:09 crc kubenswrapper[4752]: I0929 10:52:09.189036 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-57fqh"] Sep 29 10:52:09 crc kubenswrapper[4752]: 
I0929 10:52:09.201393 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-57fqh"] Sep 29 10:52:10 crc kubenswrapper[4752]: I0929 10:52:10.042619 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f756d24-5e77-4130-b920-794234a82ece" path="/var/lib/kubelet/pods/8f756d24-5e77-4130-b920-794234a82ece/volumes" Sep 29 10:52:56 crc kubenswrapper[4752]: I0929 10:52:56.175158 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:52:56 crc kubenswrapper[4752]: I0929 10:52:56.176126 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:53:26 crc kubenswrapper[4752]: I0929 10:53:26.176142 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:53:26 crc kubenswrapper[4752]: I0929 10:53:26.177085 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:53:56 crc kubenswrapper[4752]: I0929 10:53:56.175437 4752 patch_prober.go:28] interesting 
pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:53:56 crc kubenswrapper[4752]: I0929 10:53:56.176319 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:53:56 crc kubenswrapper[4752]: I0929 10:53:56.176402 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 10:53:56 crc kubenswrapper[4752]: I0929 10:53:56.177233 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f331ecf545ae76e5229e9ac291d3a8fabb44711af838903a027a58783c88d03"} pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:53:56 crc kubenswrapper[4752]: I0929 10:53:56.177328 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" containerID="cri-o://3f331ecf545ae76e5229e9ac291d3a8fabb44711af838903a027a58783c88d03" gracePeriod=600 Sep 29 10:53:56 crc kubenswrapper[4752]: E0929 10:53:56.256432 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5863c243_797d_462a_b11f_71aaf005f8d1.slice/crio-3f331ecf545ae76e5229e9ac291d3a8fabb44711af838903a027a58783c88d03.scope\": RecentStats: unable to find data in memory cache]" Sep 29 10:53:56 crc kubenswrapper[4752]: I0929 10:53:56.823392 4752 generic.go:334] "Generic (PLEG): container finished" podID="5863c243-797d-462a-b11f-71aaf005f8d1" containerID="3f331ecf545ae76e5229e9ac291d3a8fabb44711af838903a027a58783c88d03" exitCode=0 Sep 29 10:53:56 crc kubenswrapper[4752]: I0929 10:53:56.823449 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerDied","Data":"3f331ecf545ae76e5229e9ac291d3a8fabb44711af838903a027a58783c88d03"} Sep 29 10:53:56 crc kubenswrapper[4752]: I0929 10:53:56.823884 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerStarted","Data":"c407b08be26fe95221bcb36f9b8690f867d6ce7b5902b3cd248dbfd3fb7865c7"} Sep 29 10:53:56 crc kubenswrapper[4752]: I0929 10:53:56.823923 4752 scope.go:117] "RemoveContainer" containerID="907813a6b730b09c945dfe34cd11dc9926afdeb1d7a721e02b4bac45108adba9" Sep 29 10:55:35 crc kubenswrapper[4752]: I0929 10:55:35.425609 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn"] Sep 29 10:55:35 crc kubenswrapper[4752]: E0929 10:55:35.426854 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f756d24-5e77-4130-b920-794234a82ece" containerName="registry" Sep 29 10:55:35 crc kubenswrapper[4752]: I0929 10:55:35.426878 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f756d24-5e77-4130-b920-794234a82ece" containerName="registry" Sep 29 10:55:35 crc kubenswrapper[4752]: I0929 
10:55:35.427042 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f756d24-5e77-4130-b920-794234a82ece" containerName="registry" Sep 29 10:55:35 crc kubenswrapper[4752]: I0929 10:55:35.428200 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn" Sep 29 10:55:35 crc kubenswrapper[4752]: I0929 10:55:35.431639 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 29 10:55:35 crc kubenswrapper[4752]: I0929 10:55:35.440843 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn"] Sep 29 10:55:35 crc kubenswrapper[4752]: I0929 10:55:35.612259 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz92m\" (UniqueName: \"kubernetes.io/projected/fd528af5-f62d-483e-8c18-ac07cdf64251-kube-api-access-zz92m\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn\" (UID: \"fd528af5-f62d-483e-8c18-ac07cdf64251\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn" Sep 29 10:55:35 crc kubenswrapper[4752]: I0929 10:55:35.612353 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd528af5-f62d-483e-8c18-ac07cdf64251-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn\" (UID: \"fd528af5-f62d-483e-8c18-ac07cdf64251\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn" Sep 29 10:55:35 crc kubenswrapper[4752]: I0929 10:55:35.612430 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/fd528af5-f62d-483e-8c18-ac07cdf64251-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn\" (UID: \"fd528af5-f62d-483e-8c18-ac07cdf64251\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn" Sep 29 10:55:35 crc kubenswrapper[4752]: I0929 10:55:35.713678 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz92m\" (UniqueName: \"kubernetes.io/projected/fd528af5-f62d-483e-8c18-ac07cdf64251-kube-api-access-zz92m\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn\" (UID: \"fd528af5-f62d-483e-8c18-ac07cdf64251\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn" Sep 29 10:55:35 crc kubenswrapper[4752]: I0929 10:55:35.713754 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd528af5-f62d-483e-8c18-ac07cdf64251-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn\" (UID: \"fd528af5-f62d-483e-8c18-ac07cdf64251\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn" Sep 29 10:55:35 crc kubenswrapper[4752]: I0929 10:55:35.713831 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd528af5-f62d-483e-8c18-ac07cdf64251-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn\" (UID: \"fd528af5-f62d-483e-8c18-ac07cdf64251\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn" Sep 29 10:55:35 crc kubenswrapper[4752]: I0929 10:55:35.714433 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd528af5-f62d-483e-8c18-ac07cdf64251-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn\" (UID: 
\"fd528af5-f62d-483e-8c18-ac07cdf64251\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn" Sep 29 10:55:35 crc kubenswrapper[4752]: I0929 10:55:35.715182 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd528af5-f62d-483e-8c18-ac07cdf64251-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn\" (UID: \"fd528af5-f62d-483e-8c18-ac07cdf64251\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn" Sep 29 10:55:35 crc kubenswrapper[4752]: I0929 10:55:35.737357 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz92m\" (UniqueName: \"kubernetes.io/projected/fd528af5-f62d-483e-8c18-ac07cdf64251-kube-api-access-zz92m\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn\" (UID: \"fd528af5-f62d-483e-8c18-ac07cdf64251\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn" Sep 29 10:55:35 crc kubenswrapper[4752]: I0929 10:55:35.748174 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn" Sep 29 10:55:35 crc kubenswrapper[4752]: I0929 10:55:35.976683 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn"] Sep 29 10:55:36 crc kubenswrapper[4752]: I0929 10:55:36.431751 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn" event={"ID":"fd528af5-f62d-483e-8c18-ac07cdf64251","Type":"ContainerStarted","Data":"a2b5641ee8c8654a3c36276fd02ea9680a064725d69241a20fb93e6fb2c02993"} Sep 29 10:55:36 crc kubenswrapper[4752]: I0929 10:55:36.431828 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn" event={"ID":"fd528af5-f62d-483e-8c18-ac07cdf64251","Type":"ContainerStarted","Data":"c4771c8377f06dca3da68e0974f422251a69fe575fdb4848b7ddc7713357ab56"} Sep 29 10:55:37 crc kubenswrapper[4752]: I0929 10:55:37.440083 4752 generic.go:334] "Generic (PLEG): container finished" podID="fd528af5-f62d-483e-8c18-ac07cdf64251" containerID="a2b5641ee8c8654a3c36276fd02ea9680a064725d69241a20fb93e6fb2c02993" exitCode=0 Sep 29 10:55:37 crc kubenswrapper[4752]: I0929 10:55:37.440212 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn" event={"ID":"fd528af5-f62d-483e-8c18-ac07cdf64251","Type":"ContainerDied","Data":"a2b5641ee8c8654a3c36276fd02ea9680a064725d69241a20fb93e6fb2c02993"} Sep 29 10:55:37 crc kubenswrapper[4752]: I0929 10:55:37.442357 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 10:55:39 crc kubenswrapper[4752]: I0929 10:55:39.453465 4752 generic.go:334] "Generic (PLEG): container finished" podID="fd528af5-f62d-483e-8c18-ac07cdf64251" 
containerID="fd8bf2f60c97a79bd204e60d7de26693031c10387473cae013844c2e611e7b8b" exitCode=0 Sep 29 10:55:39 crc kubenswrapper[4752]: I0929 10:55:39.453528 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn" event={"ID":"fd528af5-f62d-483e-8c18-ac07cdf64251","Type":"ContainerDied","Data":"fd8bf2f60c97a79bd204e60d7de26693031c10387473cae013844c2e611e7b8b"} Sep 29 10:55:40 crc kubenswrapper[4752]: I0929 10:55:40.464682 4752 generic.go:334] "Generic (PLEG): container finished" podID="fd528af5-f62d-483e-8c18-ac07cdf64251" containerID="4fe97ee793d48222ad827dd2b4ce31329c22ed48a242cbe2fd73fb096781eec2" exitCode=0 Sep 29 10:55:40 crc kubenswrapper[4752]: I0929 10:55:40.464877 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn" event={"ID":"fd528af5-f62d-483e-8c18-ac07cdf64251","Type":"ContainerDied","Data":"4fe97ee793d48222ad827dd2b4ce31329c22ed48a242cbe2fd73fb096781eec2"} Sep 29 10:55:41 crc kubenswrapper[4752]: I0929 10:55:41.692236 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn" Sep 29 10:55:41 crc kubenswrapper[4752]: I0929 10:55:41.794055 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz92m\" (UniqueName: \"kubernetes.io/projected/fd528af5-f62d-483e-8c18-ac07cdf64251-kube-api-access-zz92m\") pod \"fd528af5-f62d-483e-8c18-ac07cdf64251\" (UID: \"fd528af5-f62d-483e-8c18-ac07cdf64251\") " Sep 29 10:55:41 crc kubenswrapper[4752]: I0929 10:55:41.794135 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd528af5-f62d-483e-8c18-ac07cdf64251-bundle\") pod \"fd528af5-f62d-483e-8c18-ac07cdf64251\" (UID: \"fd528af5-f62d-483e-8c18-ac07cdf64251\") " Sep 29 10:55:41 crc kubenswrapper[4752]: I0929 10:55:41.794207 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd528af5-f62d-483e-8c18-ac07cdf64251-util\") pod \"fd528af5-f62d-483e-8c18-ac07cdf64251\" (UID: \"fd528af5-f62d-483e-8c18-ac07cdf64251\") " Sep 29 10:55:41 crc kubenswrapper[4752]: I0929 10:55:41.796547 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd528af5-f62d-483e-8c18-ac07cdf64251-bundle" (OuterVolumeSpecName: "bundle") pod "fd528af5-f62d-483e-8c18-ac07cdf64251" (UID: "fd528af5-f62d-483e-8c18-ac07cdf64251"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:55:41 crc kubenswrapper[4752]: I0929 10:55:41.800157 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd528af5-f62d-483e-8c18-ac07cdf64251-kube-api-access-zz92m" (OuterVolumeSpecName: "kube-api-access-zz92m") pod "fd528af5-f62d-483e-8c18-ac07cdf64251" (UID: "fd528af5-f62d-483e-8c18-ac07cdf64251"). InnerVolumeSpecName "kube-api-access-zz92m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:55:41 crc kubenswrapper[4752]: I0929 10:55:41.863409 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd528af5-f62d-483e-8c18-ac07cdf64251-util" (OuterVolumeSpecName: "util") pod "fd528af5-f62d-483e-8c18-ac07cdf64251" (UID: "fd528af5-f62d-483e-8c18-ac07cdf64251"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:55:41 crc kubenswrapper[4752]: I0929 10:55:41.896682 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz92m\" (UniqueName: \"kubernetes.io/projected/fd528af5-f62d-483e-8c18-ac07cdf64251-kube-api-access-zz92m\") on node \"crc\" DevicePath \"\"" Sep 29 10:55:41 crc kubenswrapper[4752]: I0929 10:55:41.896725 4752 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd528af5-f62d-483e-8c18-ac07cdf64251-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:55:41 crc kubenswrapper[4752]: I0929 10:55:41.896736 4752 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd528af5-f62d-483e-8c18-ac07cdf64251-util\") on node \"crc\" DevicePath \"\"" Sep 29 10:55:42 crc kubenswrapper[4752]: I0929 10:55:42.480375 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn" event={"ID":"fd528af5-f62d-483e-8c18-ac07cdf64251","Type":"ContainerDied","Data":"c4771c8377f06dca3da68e0974f422251a69fe575fdb4848b7ddc7713357ab56"} Sep 29 10:55:42 crc kubenswrapper[4752]: I0929 10:55:42.480429 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4771c8377f06dca3da68e0974f422251a69fe575fdb4848b7ddc7713357ab56" Sep 29 10:55:42 crc kubenswrapper[4752]: I0929 10:55:42.480513 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn" Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.326190 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c2vrh"] Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.326814 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="sbdb" containerID="cri-o://ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191" gracePeriod=30 Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.326899 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="nbdb" containerID="cri-o://e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312" gracePeriod=30 Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.326928 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="northd" containerID="cri-o://5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009" gracePeriod=30 Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.327030 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovn-acl-logging" containerID="cri-o://b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4" gracePeriod=30 Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.326835 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" 
containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc" gracePeriod=30 Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.326984 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="kube-rbac-proxy-node" containerID="cri-o://486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560" gracePeriod=30 Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.326737 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovn-controller" containerID="cri-o://8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a" gracePeriod=30 Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.363434 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovnkube-controller" containerID="cri-o://3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7" gracePeriod=30 Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.503352 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xv5q7_52fc9378-c37b-424b-afde-7b191bab5fde/kube-multus/2.log" Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.504206 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xv5q7_52fc9378-c37b-424b-afde-7b191bab5fde/kube-multus/1.log" Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.504273 4752 generic.go:334] "Generic (PLEG): container finished" podID="52fc9378-c37b-424b-afde-7b191bab5fde" containerID="eff8591a1e7e061df63a2f3b4b4af9f4dd03197426fd89027902ac085abf289f" exitCode=2 Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 
10:55:46.504370 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xv5q7" event={"ID":"52fc9378-c37b-424b-afde-7b191bab5fde","Type":"ContainerDied","Data":"eff8591a1e7e061df63a2f3b4b4af9f4dd03197426fd89027902ac085abf289f"} Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.504453 4752 scope.go:117] "RemoveContainer" containerID="d36b7c0411c2a5cbcb37f626fa70cfe6c7d3fc6280f6a9e882fa27766f6de761" Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.505024 4752 scope.go:117] "RemoveContainer" containerID="eff8591a1e7e061df63a2f3b4b4af9f4dd03197426fd89027902ac085abf289f" Sep 29 10:55:46 crc kubenswrapper[4752]: E0929 10:55:46.505218 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-xv5q7_openshift-multus(52fc9378-c37b-424b-afde-7b191bab5fde)\"" pod="openshift-multus/multus-xv5q7" podUID="52fc9378-c37b-424b-afde-7b191bab5fde" Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.509185 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovnkube-controller/3.log" Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.513956 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovn-acl-logging/0.log" Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.514467 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovn-controller/0.log" Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.514918 4752 generic.go:334] "Generic (PLEG): container finished" podID="94028c24-ec10-4d5c-b32c-1700e677d539" containerID="3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7" exitCode=0 Sep 29 10:55:46 crc 
kubenswrapper[4752]: I0929 10:55:46.514957 4752 generic.go:334] "Generic (PLEG): container finished" podID="94028c24-ec10-4d5c-b32c-1700e677d539" containerID="e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc" exitCode=0 Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.514966 4752 generic.go:334] "Generic (PLEG): container finished" podID="94028c24-ec10-4d5c-b32c-1700e677d539" containerID="486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560" exitCode=0 Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.514974 4752 generic.go:334] "Generic (PLEG): container finished" podID="94028c24-ec10-4d5c-b32c-1700e677d539" containerID="b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4" exitCode=143 Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.514983 4752 generic.go:334] "Generic (PLEG): container finished" podID="94028c24-ec10-4d5c-b32c-1700e677d539" containerID="8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a" exitCode=143 Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.515007 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerDied","Data":"3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7"} Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.515038 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerDied","Data":"e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc"} Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.515051 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerDied","Data":"486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560"} Sep 29 10:55:46 crc kubenswrapper[4752]: 
I0929 10:55:46.515062 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerDied","Data":"b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4"} Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.515071 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerDied","Data":"8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a"} Sep 29 10:55:46 crc kubenswrapper[4752]: I0929 10:55:46.573404 4752 scope.go:117] "RemoveContainer" containerID="f5083afbe3807e485df0ceb9323e330b0f37722f050f83895507559c9f655a21" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.075759 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovn-acl-logging/0.log" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.076526 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovn-controller/0.log" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.076933 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.169533 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-cni-bin\") pod \"94028c24-ec10-4d5c-b32c-1700e677d539\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.169593 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-slash\") pod \"94028c24-ec10-4d5c-b32c-1700e677d539\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.169619 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-cni-netd\") pod \"94028c24-ec10-4d5c-b32c-1700e677d539\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.169770 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/94028c24-ec10-4d5c-b32c-1700e677d539-env-overrides\") pod \"94028c24-ec10-4d5c-b32c-1700e677d539\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.169681 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "94028c24-ec10-4d5c-b32c-1700e677d539" (UID: "94028c24-ec10-4d5c-b32c-1700e677d539"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.169707 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "94028c24-ec10-4d5c-b32c-1700e677d539" (UID: "94028c24-ec10-4d5c-b32c-1700e677d539"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.169710 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-slash" (OuterVolumeSpecName: "host-slash") pod "94028c24-ec10-4d5c-b32c-1700e677d539" (UID: "94028c24-ec10-4d5c-b32c-1700e677d539"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.169945 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "94028c24-ec10-4d5c-b32c-1700e677d539" (UID: "94028c24-ec10-4d5c-b32c-1700e677d539"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.170225 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94028c24-ec10-4d5c-b32c-1700e677d539-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "94028c24-ec10-4d5c-b32c-1700e677d539" (UID: "94028c24-ec10-4d5c-b32c-1700e677d539"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.170282 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-run-netns\") pod \"94028c24-ec10-4d5c-b32c-1700e677d539\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.170335 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-run-openvswitch\") pod \"94028c24-ec10-4d5c-b32c-1700e677d539\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.170363 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-etc-openvswitch\") pod \"94028c24-ec10-4d5c-b32c-1700e677d539\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.170408 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "94028c24-ec10-4d5c-b32c-1700e677d539" (UID: "94028c24-ec10-4d5c-b32c-1700e677d539"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.170504 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "94028c24-ec10-4d5c-b32c-1700e677d539" (UID: "94028c24-ec10-4d5c-b32c-1700e677d539"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.170472 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-log-socket\") pod \"94028c24-ec10-4d5c-b32c-1700e677d539\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.170566 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-run-systemd\") pod \"94028c24-ec10-4d5c-b32c-1700e677d539\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.171745 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-var-lib-openvswitch\") pod \"94028c24-ec10-4d5c-b32c-1700e677d539\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.170605 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-log-socket" (OuterVolumeSpecName: "log-socket") pod "94028c24-ec10-4d5c-b32c-1700e677d539" (UID: "94028c24-ec10-4d5c-b32c-1700e677d539"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.171829 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-run-ovn-kubernetes\") pod \"94028c24-ec10-4d5c-b32c-1700e677d539\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.171894 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "94028c24-ec10-4d5c-b32c-1700e677d539" (UID: "94028c24-ec10-4d5c-b32c-1700e677d539"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.171860 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-kubelet\") pod \"94028c24-ec10-4d5c-b32c-1700e677d539\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.171915 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "94028c24-ec10-4d5c-b32c-1700e677d539" (UID: "94028c24-ec10-4d5c-b32c-1700e677d539"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.171932 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v6qn\" (UniqueName: \"kubernetes.io/projected/94028c24-ec10-4d5c-b32c-1700e677d539-kube-api-access-9v6qn\") pod \"94028c24-ec10-4d5c-b32c-1700e677d539\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.171959 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "94028c24-ec10-4d5c-b32c-1700e677d539" (UID: "94028c24-ec10-4d5c-b32c-1700e677d539"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.171978 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-run-ovn\") pod \"94028c24-ec10-4d5c-b32c-1700e677d539\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.172002 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/94028c24-ec10-4d5c-b32c-1700e677d539-ovnkube-config\") pod \"94028c24-ec10-4d5c-b32c-1700e677d539\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.172050 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "94028c24-ec10-4d5c-b32c-1700e677d539" (UID: "94028c24-ec10-4d5c-b32c-1700e677d539"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.172102 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-systemd-units\") pod \"94028c24-ec10-4d5c-b32c-1700e677d539\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.172124 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-var-lib-cni-networks-ovn-kubernetes\") pod \"94028c24-ec10-4d5c-b32c-1700e677d539\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.172183 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-node-log\") pod \"94028c24-ec10-4d5c-b32c-1700e677d539\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.172249 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "94028c24-ec10-4d5c-b32c-1700e677d539" (UID: "94028c24-ec10-4d5c-b32c-1700e677d539"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.172261 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "94028c24-ec10-4d5c-b32c-1700e677d539" (UID: "94028c24-ec10-4d5c-b32c-1700e677d539"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.172287 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-node-log" (OuterVolumeSpecName: "node-log") pod "94028c24-ec10-4d5c-b32c-1700e677d539" (UID: "94028c24-ec10-4d5c-b32c-1700e677d539"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.172284 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/94028c24-ec10-4d5c-b32c-1700e677d539-ovn-node-metrics-cert\") pod \"94028c24-ec10-4d5c-b32c-1700e677d539\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.172379 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/94028c24-ec10-4d5c-b32c-1700e677d539-ovnkube-script-lib\") pod \"94028c24-ec10-4d5c-b32c-1700e677d539\" (UID: \"94028c24-ec10-4d5c-b32c-1700e677d539\") " Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.172517 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94028c24-ec10-4d5c-b32c-1700e677d539-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "94028c24-ec10-4d5c-b32c-1700e677d539" (UID: "94028c24-ec10-4d5c-b32c-1700e677d539"). 
InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.173021 4752 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-cni-bin\") on node \"crc\" DevicePath \"\"" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.173049 4752 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-slash\") on node \"crc\" DevicePath \"\"" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.173061 4752 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-cni-netd\") on node \"crc\" DevicePath \"\"" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.173071 4752 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/94028c24-ec10-4d5c-b32c-1700e677d539-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.173080 4752 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-run-netns\") on node \"crc\" DevicePath \"\"" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.173090 4752 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-run-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.173113 4752 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 29 10:55:47 crc kubenswrapper[4752]: 
I0929 10:55:47.173121 4752 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-log-socket\") on node \"crc\" DevicePath \"\"" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.173130 4752 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.173141 4752 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.173155 4752 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-kubelet\") on node \"crc\" DevicePath \"\"" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.173163 4752 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.173175 4752 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/94028c24-ec10-4d5c-b32c-1700e677d539-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.173182 4752 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-systemd-units\") on node \"crc\" DevicePath \"\"" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.173191 4752 reconciler_common.go:293] "Volume detached for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.173199 4752 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-node-log\") on node \"crc\" DevicePath \"\"" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.173019 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94028c24-ec10-4d5c-b32c-1700e677d539-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "94028c24-ec10-4d5c-b32c-1700e677d539" (UID: "94028c24-ec10-4d5c-b32c-1700e677d539"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.189459 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94028c24-ec10-4d5c-b32c-1700e677d539-kube-api-access-9v6qn" (OuterVolumeSpecName: "kube-api-access-9v6qn") pod "94028c24-ec10-4d5c-b32c-1700e677d539" (UID: "94028c24-ec10-4d5c-b32c-1700e677d539"). InnerVolumeSpecName "kube-api-access-9v6qn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.191095 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94028c24-ec10-4d5c-b32c-1700e677d539-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "94028c24-ec10-4d5c-b32c-1700e677d539" (UID: "94028c24-ec10-4d5c-b32c-1700e677d539"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.191401 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jxmdk"] Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.191664 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="sbdb" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.191687 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="sbdb" Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.191698 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovnkube-controller" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.191708 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovnkube-controller" Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.191717 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="kube-rbac-proxy-ovn-metrics" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.191725 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="kube-rbac-proxy-ovn-metrics" Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.191745 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd528af5-f62d-483e-8c18-ac07cdf64251" containerName="util" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.191753 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd528af5-f62d-483e-8c18-ac07cdf64251" containerName="util" Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.191764 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" 
containerName="ovnkube-controller" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.191771 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovnkube-controller" Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.191780 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovnkube-controller" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.191789 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovnkube-controller" Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.191814 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="northd" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.191825 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="northd" Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.191838 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd528af5-f62d-483e-8c18-ac07cdf64251" containerName="pull" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.191847 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd528af5-f62d-483e-8c18-ac07cdf64251" containerName="pull" Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.191857 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="kubecfg-setup" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.191865 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="kubecfg-setup" Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.191875 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovnkube-controller" Sep 29 
10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.191883 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovnkube-controller"
Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.191898 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovnkube-controller"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.191908 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovnkube-controller"
Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.191919 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd528af5-f62d-483e-8c18-ac07cdf64251" containerName="extract"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.191927 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd528af5-f62d-483e-8c18-ac07cdf64251" containerName="extract"
Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.191940 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="nbdb"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.191948 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="nbdb"
Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.191958 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovn-acl-logging"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.191966 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovn-acl-logging"
Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.191974 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovn-controller"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.191982 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovn-controller"
Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.191993 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="kube-rbac-proxy-node"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.192000 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="kube-rbac-proxy-node"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.192124 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="sbdb"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.192135 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="kube-rbac-proxy-node"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.192147 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovnkube-controller"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.192156 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd528af5-f62d-483e-8c18-ac07cdf64251" containerName="extract"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.192168 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="kube-rbac-proxy-ovn-metrics"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.192180 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovnkube-controller"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.192188 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="nbdb"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.192200 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovnkube-controller"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.192208 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovn-acl-logging"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.192217 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="northd"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.192226 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovn-controller"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.192437 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovnkube-controller"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.192451 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" containerName="ovnkube-controller"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.194591 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.198650 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "94028c24-ec10-4d5c-b32c-1700e677d539" (UID: "94028c24-ec10-4d5c-b32c-1700e677d539"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.274816 4752 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/94028c24-ec10-4d5c-b32c-1700e677d539-run-systemd\") on node \"crc\" DevicePath \"\""
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.274852 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v6qn\" (UniqueName: \"kubernetes.io/projected/94028c24-ec10-4d5c-b32c-1700e677d539-kube-api-access-9v6qn\") on node \"crc\" DevicePath \"\""
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.274862 4752 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/94028c24-ec10-4d5c-b32c-1700e677d539-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.274870 4752 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/94028c24-ec10-4d5c-b32c-1700e677d539-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.376529 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d26fb315-5752-4082-a006-d18643556c6f-ovnkube-config\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.376600 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-var-lib-openvswitch\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.376629 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-etc-openvswitch\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.376645 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d26fb315-5752-4082-a006-d18643556c6f-ovnkube-script-lib\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.376679 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-cni-bin\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.376696 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-log-socket\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.376718 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-run-openvswitch\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.376747 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-cni-netd\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.376770 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-node-log\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.376949 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d26fb315-5752-4082-a006-d18643556c6f-ovn-node-metrics-cert\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.377022 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-run-ovn-kubernetes\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.377054 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-systemd-units\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.377071 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-run-systemd\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.377086 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49s7v\" (UniqueName: \"kubernetes.io/projected/d26fb315-5752-4082-a006-d18643556c6f-kube-api-access-49s7v\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.377131 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-run-netns\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.377275 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d26fb315-5752-4082-a006-d18643556c6f-env-overrides\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.377351 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-kubelet\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.377376 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-slash\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.377412 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-run-ovn\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.377435 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478379 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d26fb315-5752-4082-a006-d18643556c6f-ovnkube-config\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478439 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-var-lib-openvswitch\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478461 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-etc-openvswitch\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478480 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d26fb315-5752-4082-a006-d18643556c6f-ovnkube-script-lib\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478502 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-cni-bin\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478523 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-log-socket\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478551 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-run-openvswitch\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478569 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-cni-netd\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478592 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-node-log\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478616 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d26fb315-5752-4082-a006-d18643556c6f-ovn-node-metrics-cert\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478605 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-var-lib-openvswitch\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478650 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-node-log\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478670 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-run-ovn-kubernetes\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478639 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-run-ovn-kubernetes\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478605 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-etc-openvswitch\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478735 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-systemd-units\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478701 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-cni-bin\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478704 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-run-openvswitch\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478790 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-run-systemd\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478696 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-cni-netd\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478819 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-log-socket\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478839 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-systemd-units\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478891 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49s7v\" (UniqueName: \"kubernetes.io/projected/d26fb315-5752-4082-a006-d18643556c6f-kube-api-access-49s7v\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478953 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-run-systemd\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478988 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-run-netns\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.478966 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-run-netns\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.479079 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d26fb315-5752-4082-a006-d18643556c6f-env-overrides\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.479121 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-kubelet\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.479147 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-slash\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.479176 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-run-ovn\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.479214 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.479352 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.479353 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d26fb315-5752-4082-a006-d18643556c6f-ovnkube-config\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.479372 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-kubelet\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.479384 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-host-slash\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.479430 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d26fb315-5752-4082-a006-d18643556c6f-run-ovn\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.479444 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d26fb315-5752-4082-a006-d18643556c6f-ovnkube-script-lib\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.479960 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d26fb315-5752-4082-a006-d18643556c6f-env-overrides\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.484711 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d26fb315-5752-4082-a006-d18643556c6f-ovn-node-metrics-cert\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.501410 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49s7v\" (UniqueName: \"kubernetes.io/projected/d26fb315-5752-4082-a006-d18643556c6f-kube-api-access-49s7v\") pod \"ovnkube-node-jxmdk\" (UID: \"d26fb315-5752-4082-a006-d18643556c6f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.511437 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.523218 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xv5q7_52fc9378-c37b-424b-afde-7b191bab5fde/kube-multus/2.log"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.528513 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovn-acl-logging/0.log"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.529950 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2vrh_94028c24-ec10-4d5c-b32c-1700e677d539/ovn-controller/0.log"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.534371 4752 generic.go:334] "Generic (PLEG): container finished" podID="94028c24-ec10-4d5c-b32c-1700e677d539" containerID="ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191" exitCode=0
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.534410 4752 generic.go:334] "Generic (PLEG): container finished" podID="94028c24-ec10-4d5c-b32c-1700e677d539" containerID="e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312" exitCode=0
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.534421 4752 generic.go:334] "Generic (PLEG): container finished" podID="94028c24-ec10-4d5c-b32c-1700e677d539" containerID="5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009" exitCode=0
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.534460 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerDied","Data":"ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191"}
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.534503 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerDied","Data":"e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312"}
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.534518 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerDied","Data":"5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009"}
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.534530 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh" event={"ID":"94028c24-ec10-4d5c-b32c-1700e677d539","Type":"ContainerDied","Data":"905a8e32917410182bb8374b5bc38a9b93ec66b303093ada9c998360c91433a2"}
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.534530 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c2vrh"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.534555 4752 scope.go:117] "RemoveContainer" containerID="3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.601051 4752 scope.go:117] "RemoveContainer" containerID="ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.627357 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c2vrh"]
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.638143 4752 scope.go:117] "RemoveContainer" containerID="e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.640906 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c2vrh"]
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.674574 4752 scope.go:117] "RemoveContainer" containerID="5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.705118 4752 scope.go:117] "RemoveContainer" containerID="e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.746944 4752 scope.go:117] "RemoveContainer" containerID="486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.780916 4752 scope.go:117] "RemoveContainer" containerID="b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.825038 4752 scope.go:117] "RemoveContainer" containerID="8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.875567 4752 scope.go:117] "RemoveContainer" containerID="f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.893920 4752 scope.go:117] "RemoveContainer" containerID="3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7"
Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.894508 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7\": container with ID starting with 3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7 not found: ID does not exist" containerID="3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.894549 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7"} err="failed to get container status \"3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7\": rpc error: code = NotFound desc = could not find container \"3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7\": container with ID starting with 3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7 not found: ID does not exist"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.894587 4752 scope.go:117] "RemoveContainer" containerID="ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191"
Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.895700 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\": container with ID starting with ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191 not found: ID does not exist" containerID="ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.895785 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191"} err="failed to get container status \"ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\": rpc error: code = NotFound desc = could not find container \"ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\": container with ID starting with ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191 not found: ID does not exist"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.895846 4752 scope.go:117] "RemoveContainer" containerID="e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312"
Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.896250 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\": container with ID starting with e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312 not found: ID does not exist" containerID="e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.896305 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312"} err="failed to get container status \"e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\": rpc error: code = NotFound desc = could not find container \"e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\": container with ID starting with e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312 not found: ID does not exist"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.896330 4752 scope.go:117] "RemoveContainer" containerID="5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009"
Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.896614 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\": container with ID starting with 5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009 not found: ID does not exist" containerID="5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.896644 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009"} err="failed to get container status \"5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\": rpc error: code = NotFound desc = could not find container \"5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\": container with ID starting with 5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009 not found: ID does not exist"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.896662 4752 scope.go:117] "RemoveContainer" containerID="e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc"
Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.896984 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\": container with ID starting with e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc not found: ID does not exist" containerID="e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc"
Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.897007 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc"} err="failed to get container status 
\"e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\": rpc error: code = NotFound desc = could not find container \"e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\": container with ID starting with e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.897028 4752 scope.go:117] "RemoveContainer" containerID="486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560" Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.897283 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\": container with ID starting with 486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560 not found: ID does not exist" containerID="486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.897311 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560"} err="failed to get container status \"486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\": rpc error: code = NotFound desc = could not find container \"486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\": container with ID starting with 486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560 not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.897329 4752 scope.go:117] "RemoveContainer" containerID="b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4" Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.897564 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\": container with ID starting with b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4 not found: ID does not exist" containerID="b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.897592 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4"} err="failed to get container status \"b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\": rpc error: code = NotFound desc = could not find container \"b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\": container with ID starting with b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4 not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.897609 4752 scope.go:117] "RemoveContainer" containerID="8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a" Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.897947 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\": container with ID starting with 8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a not found: ID does not exist" containerID="8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.897989 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a"} err="failed to get container status \"8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\": rpc error: code = NotFound desc = could not find container \"8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\": container with ID 
starting with 8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.898019 4752 scope.go:117] "RemoveContainer" containerID="f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338" Sep 29 10:55:47 crc kubenswrapper[4752]: E0929 10:55:47.898324 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\": container with ID starting with f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338 not found: ID does not exist" containerID="f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.898356 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338"} err="failed to get container status \"f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\": rpc error: code = NotFound desc = could not find container \"f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\": container with ID starting with f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338 not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.898377 4752 scope.go:117] "RemoveContainer" containerID="3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.898628 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7"} err="failed to get container status \"3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7\": rpc error: code = NotFound desc = could not find container \"3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7\": 
container with ID starting with 3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7 not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.898649 4752 scope.go:117] "RemoveContainer" containerID="ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.898957 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191"} err="failed to get container status \"ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\": rpc error: code = NotFound desc = could not find container \"ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\": container with ID starting with ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191 not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.898986 4752 scope.go:117] "RemoveContainer" containerID="e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.899479 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312"} err="failed to get container status \"e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\": rpc error: code = NotFound desc = could not find container \"e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\": container with ID starting with e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312 not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.899508 4752 scope.go:117] "RemoveContainer" containerID="5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.899785 4752 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009"} err="failed to get container status \"5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\": rpc error: code = NotFound desc = could not find container \"5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\": container with ID starting with 5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009 not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.899832 4752 scope.go:117] "RemoveContainer" containerID="e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.900082 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc"} err="failed to get container status \"e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\": rpc error: code = NotFound desc = could not find container \"e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\": container with ID starting with e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.900115 4752 scope.go:117] "RemoveContainer" containerID="486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.900573 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560"} err="failed to get container status \"486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\": rpc error: code = NotFound desc = could not find container \"486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\": container with ID starting with 486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560 not found: ID does not 
exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.900600 4752 scope.go:117] "RemoveContainer" containerID="b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.900880 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4"} err="failed to get container status \"b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\": rpc error: code = NotFound desc = could not find container \"b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\": container with ID starting with b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4 not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.900910 4752 scope.go:117] "RemoveContainer" containerID="8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.901173 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a"} err="failed to get container status \"8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\": rpc error: code = NotFound desc = could not find container \"8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\": container with ID starting with 8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.901203 4752 scope.go:117] "RemoveContainer" containerID="f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.901438 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338"} err="failed to get container status 
\"f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\": rpc error: code = NotFound desc = could not find container \"f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\": container with ID starting with f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338 not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.901458 4752 scope.go:117] "RemoveContainer" containerID="3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.901683 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7"} err="failed to get container status \"3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7\": rpc error: code = NotFound desc = could not find container \"3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7\": container with ID starting with 3019dad252df73aaf83bc4c0b714472cf54345012a9a5b83a88315570d972fb7 not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.901709 4752 scope.go:117] "RemoveContainer" containerID="ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.901986 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191"} err="failed to get container status \"ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\": rpc error: code = NotFound desc = could not find container \"ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191\": container with ID starting with ea11fb795febf50e35263b0a02c32a01fd69937dfbfe196696cd1792e40cc191 not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.902014 4752 scope.go:117] "RemoveContainer" 
containerID="e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.902268 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312"} err="failed to get container status \"e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\": rpc error: code = NotFound desc = could not find container \"e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312\": container with ID starting with e34a55130babbc5fbe9fb81d05fc687dc1b06c3bffea762ba699f9f6c317b312 not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.902302 4752 scope.go:117] "RemoveContainer" containerID="5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.902566 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009"} err="failed to get container status \"5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\": rpc error: code = NotFound desc = could not find container \"5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009\": container with ID starting with 5985eb5ebc8fa2ca986873aea235335770621597493b43eaa58d98329cd37009 not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.902586 4752 scope.go:117] "RemoveContainer" containerID="e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.902796 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc"} err="failed to get container status \"e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\": rpc error: code = NotFound desc = could 
not find container \"e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc\": container with ID starting with e2860691a355a598f52a1f13213198fa7889748e67cca21a617ed5714f5eabcc not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.902856 4752 scope.go:117] "RemoveContainer" containerID="486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.903092 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560"} err="failed to get container status \"486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\": rpc error: code = NotFound desc = could not find container \"486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560\": container with ID starting with 486ac9c45cc8e6cc88a199b152343c1db14c51125b4357c85d5d082467fc4560 not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.903114 4752 scope.go:117] "RemoveContainer" containerID="b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.903342 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4"} err="failed to get container status \"b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\": rpc error: code = NotFound desc = could not find container \"b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4\": container with ID starting with b46368b26939edaf377aa86ef45fc9dc3ec4fa274dfe1cba458bafb8d32309e4 not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.903369 4752 scope.go:117] "RemoveContainer" containerID="8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 
10:55:47.903623 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a"} err="failed to get container status \"8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\": rpc error: code = NotFound desc = could not find container \"8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a\": container with ID starting with 8a98f237ee9baeb799b2ea76ccbe7b349ed70b50f47738fc514ae56b46ee8d1a not found: ID does not exist" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.903642 4752 scope.go:117] "RemoveContainer" containerID="f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338" Sep 29 10:55:47 crc kubenswrapper[4752]: I0929 10:55:47.903883 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338"} err="failed to get container status \"f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\": rpc error: code = NotFound desc = could not find container \"f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338\": container with ID starting with f22dfbbd26fb3ebf4869b46406913cc1963e33c11794193c815235be5acee338 not found: ID does not exist" Sep 29 10:55:48 crc kubenswrapper[4752]: I0929 10:55:48.038995 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94028c24-ec10-4d5c-b32c-1700e677d539" path="/var/lib/kubelet/pods/94028c24-ec10-4d5c-b32c-1700e677d539/volumes" Sep 29 10:55:48 crc kubenswrapper[4752]: I0929 10:55:48.540850 4752 generic.go:334] "Generic (PLEG): container finished" podID="d26fb315-5752-4082-a006-d18643556c6f" containerID="a345e109fe00901e3400b7d7d7d8bd7de07b25f082bf7bed1bddfbea0e03856a" exitCode=0 Sep 29 10:55:48 crc kubenswrapper[4752]: I0929 10:55:48.541099 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk" 
event={"ID":"d26fb315-5752-4082-a006-d18643556c6f","Type":"ContainerDied","Data":"a345e109fe00901e3400b7d7d7d8bd7de07b25f082bf7bed1bddfbea0e03856a"} Sep 29 10:55:48 crc kubenswrapper[4752]: I0929 10:55:48.541232 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk" event={"ID":"d26fb315-5752-4082-a006-d18643556c6f","Type":"ContainerStarted","Data":"983324bba66e27079d616811b4c911377d437d674f3c8f02965060b3861b6216"} Sep 29 10:55:49 crc kubenswrapper[4752]: I0929 10:55:49.552645 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk" event={"ID":"d26fb315-5752-4082-a006-d18643556c6f","Type":"ContainerStarted","Data":"79b41ca772a51c10b34a2e8fc47e959f61e95f76dafcc628f0c179c4b373c6a1"} Sep 29 10:55:49 crc kubenswrapper[4752]: I0929 10:55:49.553637 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk" event={"ID":"d26fb315-5752-4082-a006-d18643556c6f","Type":"ContainerStarted","Data":"895174a31685094dc69e4959e2ca9226ae5b94a6b9b2d2760836e4137e90cc30"} Sep 29 10:55:49 crc kubenswrapper[4752]: I0929 10:55:49.553661 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk" event={"ID":"d26fb315-5752-4082-a006-d18643556c6f","Type":"ContainerStarted","Data":"0cc6aa1b02dc842e835e4bb35638da992e6ac23403af42aaeaafb9b283bd5c63"} Sep 29 10:55:49 crc kubenswrapper[4752]: I0929 10:55:49.553673 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk" event={"ID":"d26fb315-5752-4082-a006-d18643556c6f","Type":"ContainerStarted","Data":"0f56e1e606c0f6ca5b630913f41d4ab7832308fc99592988e58f285adad5b470"} Sep 29 10:55:49 crc kubenswrapper[4752]: I0929 10:55:49.553684 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk" 
event={"ID":"d26fb315-5752-4082-a006-d18643556c6f","Type":"ContainerStarted","Data":"8bdc21abc7840619a0130fa28264e9d4ffd2d7c3c8ce0b1134f4c2a0cb853226"} Sep 29 10:55:50 crc kubenswrapper[4752]: I0929 10:55:50.561313 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk" event={"ID":"d26fb315-5752-4082-a006-d18643556c6f","Type":"ContainerStarted","Data":"8bc6ac68f84a4899fff4ba72ddb9ee65097ca23653e870c4ae768902be236dc0"} Sep 29 10:55:52 crc kubenswrapper[4752]: I0929 10:55:52.587649 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk" event={"ID":"d26fb315-5752-4082-a006-d18643556c6f","Type":"ContainerStarted","Data":"47aacdb22c334e5b2cdfc0d3a5ede72b1ff12f1f10dffaa43d345bb11ec0754c"} Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.169508 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw"] Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.170524 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.173643 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.173924 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.174247 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-bxpz6" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.333796 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7"] Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.334926 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.337531 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-bxqcs" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.337548 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.340282 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9"] Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.341283 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.373181 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcmkn\" (UniqueName: \"kubernetes.io/projected/6fed8192-c969-4d5e-90f8-8dfcbdb533f6-kube-api-access-vcmkn\") pod \"obo-prometheus-operator-7c8cf85677-48xsw\" (UID: \"6fed8192-c969-4d5e-90f8-8dfcbdb533f6\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.373283 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/81a0fa10-1cf0-4d64-8d87-be74cb9f191c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7\" (UID: \"81a0fa10-1cf0-4d64-8d87-be74cb9f191c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.373318 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/81a0fa10-1cf0-4d64-8d87-be74cb9f191c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7\" (UID: \"81a0fa10-1cf0-4d64-8d87-be74cb9f191c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.373350 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0dcbf226-4af2-490a-8974-2c107af2a51f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-66f87598ff-nzms9\" (UID: \"0dcbf226-4af2-490a-8974-2c107af2a51f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" Sep 29 
10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.373469 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0dcbf226-4af2-490a-8974-2c107af2a51f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-66f87598ff-nzms9\" (UID: \"0dcbf226-4af2-490a-8974-2c107af2a51f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.474295 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0dcbf226-4af2-490a-8974-2c107af2a51f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-66f87598ff-nzms9\" (UID: \"0dcbf226-4af2-490a-8974-2c107af2a51f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.474377 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcmkn\" (UniqueName: \"kubernetes.io/projected/6fed8192-c969-4d5e-90f8-8dfcbdb533f6-kube-api-access-vcmkn\") pod \"obo-prometheus-operator-7c8cf85677-48xsw\" (UID: \"6fed8192-c969-4d5e-90f8-8dfcbdb533f6\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.474418 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/81a0fa10-1cf0-4d64-8d87-be74cb9f191c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7\" (UID: \"81a0fa10-1cf0-4d64-8d87-be74cb9f191c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.474445 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/81a0fa10-1cf0-4d64-8d87-be74cb9f191c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7\" (UID: \"81a0fa10-1cf0-4d64-8d87-be74cb9f191c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.474476 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0dcbf226-4af2-490a-8974-2c107af2a51f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-66f87598ff-nzms9\" (UID: \"0dcbf226-4af2-490a-8974-2c107af2a51f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.483670 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0dcbf226-4af2-490a-8974-2c107af2a51f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-66f87598ff-nzms9\" (UID: \"0dcbf226-4af2-490a-8974-2c107af2a51f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.489041 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0dcbf226-4af2-490a-8974-2c107af2a51f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-66f87598ff-nzms9\" (UID: \"0dcbf226-4af2-490a-8974-2c107af2a51f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.489118 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/81a0fa10-1cf0-4d64-8d87-be74cb9f191c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7\" (UID: \"81a0fa10-1cf0-4d64-8d87-be74cb9f191c\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.489521 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/81a0fa10-1cf0-4d64-8d87-be74cb9f191c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7\" (UID: \"81a0fa10-1cf0-4d64-8d87-be74cb9f191c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.498706 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcmkn\" (UniqueName: \"kubernetes.io/projected/6fed8192-c969-4d5e-90f8-8dfcbdb533f6-kube-api-access-vcmkn\") pod \"obo-prometheus-operator-7c8cf85677-48xsw\" (UID: \"6fed8192-c969-4d5e-90f8-8dfcbdb533f6\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.511071 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-xlqjx"] Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.511887 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.514778 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.515042 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-68x2j" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.575132 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/48a83e0b-a019-4677-9d2c-4eaafc3a36b9-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-xlqjx\" (UID: \"48a83e0b-a019-4677-9d2c-4eaafc3a36b9\") " pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.575216 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq4fv\" (UniqueName: \"kubernetes.io/projected/48a83e0b-a019-4677-9d2c-4eaafc3a36b9-kube-api-access-lq4fv\") pod \"observability-operator-cc5f78dfc-xlqjx\" (UID: \"48a83e0b-a019-4677-9d2c-4eaafc3a36b9\") " pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.652524 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.663245 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.677430 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/48a83e0b-a019-4677-9d2c-4eaafc3a36b9-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-xlqjx\" (UID: \"48a83e0b-a019-4677-9d2c-4eaafc3a36b9\") " pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.677520 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq4fv\" (UniqueName: \"kubernetes.io/projected/48a83e0b-a019-4677-9d2c-4eaafc3a36b9-kube-api-access-lq4fv\") pod \"observability-operator-cc5f78dfc-xlqjx\" (UID: \"48a83e0b-a019-4677-9d2c-4eaafc3a36b9\") " pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.684889 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/48a83e0b-a019-4677-9d2c-4eaafc3a36b9-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-xlqjx\" (UID: \"48a83e0b-a019-4677-9d2c-4eaafc3a36b9\") " pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.715783 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq4fv\" (UniqueName: \"kubernetes.io/projected/48a83e0b-a019-4677-9d2c-4eaafc3a36b9-kube-api-access-lq4fv\") pod \"observability-operator-cc5f78dfc-xlqjx\" (UID: \"48a83e0b-a019-4677-9d2c-4eaafc3a36b9\") " pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.722769 4752 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operators/perses-operator-54bc95c9fb-gvb7z"] Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.723776 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.726126 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-7298g" Sep 29 10:55:53 crc kubenswrapper[4752]: E0929 10:55:53.740292 4752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-nzms9_openshift-operators_0dcbf226-4af2-490a-8974-2c107af2a51f_0(3bcbc8f2036979c1ddad86d379b609dfb3c92d15615ea02ab7712c9387982659): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 10:55:53 crc kubenswrapper[4752]: E0929 10:55:53.740396 4752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-nzms9_openshift-operators_0dcbf226-4af2-490a-8974-2c107af2a51f_0(3bcbc8f2036979c1ddad86d379b609dfb3c92d15615ea02ab7712c9387982659): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" Sep 29 10:55:53 crc kubenswrapper[4752]: E0929 10:55:53.740430 4752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-nzms9_openshift-operators_0dcbf226-4af2-490a-8974-2c107af2a51f_0(3bcbc8f2036979c1ddad86d379b609dfb3c92d15615ea02ab7712c9387982659): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" Sep 29 10:55:53 crc kubenswrapper[4752]: E0929 10:55:53.740490 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-66f87598ff-nzms9_openshift-operators(0dcbf226-4af2-490a-8974-2c107af2a51f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-66f87598ff-nzms9_openshift-operators(0dcbf226-4af2-490a-8974-2c107af2a51f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-nzms9_openshift-operators_0dcbf226-4af2-490a-8974-2c107af2a51f_0(3bcbc8f2036979c1ddad86d379b609dfb3c92d15615ea02ab7712c9387982659): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" podUID="0dcbf226-4af2-490a-8974-2c107af2a51f" Sep 29 10:55:53 crc kubenswrapper[4752]: E0929 10:55:53.745519 4752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7_openshift-operators_81a0fa10-1cf0-4d64-8d87-be74cb9f191c_0(abf5bdd5d93895072ba703da05c6f2d33879ca2c1b0c5136f048a911eefa60da): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 10:55:53 crc kubenswrapper[4752]: E0929 10:55:53.745623 4752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7_openshift-operators_81a0fa10-1cf0-4d64-8d87-be74cb9f191c_0(abf5bdd5d93895072ba703da05c6f2d33879ca2c1b0c5136f048a911eefa60da): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" Sep 29 10:55:53 crc kubenswrapper[4752]: E0929 10:55:53.745654 4752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7_openshift-operators_81a0fa10-1cf0-4d64-8d87-be74cb9f191c_0(abf5bdd5d93895072ba703da05c6f2d33879ca2c1b0c5136f048a911eefa60da): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" Sep 29 10:55:53 crc kubenswrapper[4752]: E0929 10:55:53.745708 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7_openshift-operators(81a0fa10-1cf0-4d64-8d87-be74cb9f191c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7_openshift-operators(81a0fa10-1cf0-4d64-8d87-be74cb9f191c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7_openshift-operators_81a0fa10-1cf0-4d64-8d87-be74cb9f191c_0(abf5bdd5d93895072ba703da05c6f2d33879ca2c1b0c5136f048a911eefa60da): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" podUID="81a0fa10-1cf0-4d64-8d87-be74cb9f191c" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.789372 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" Sep 29 10:55:53 crc kubenswrapper[4752]: E0929 10:55:53.813043 4752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-48xsw_openshift-operators_6fed8192-c969-4d5e-90f8-8dfcbdb533f6_0(ed8f97cc836fd1fe0aa63d1c6c577deef92ed3ca3417087b817856647f48a45b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 10:55:53 crc kubenswrapper[4752]: E0929 10:55:53.813155 4752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-48xsw_openshift-operators_6fed8192-c969-4d5e-90f8-8dfcbdb533f6_0(ed8f97cc836fd1fe0aa63d1c6c577deef92ed3ca3417087b817856647f48a45b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" Sep 29 10:55:53 crc kubenswrapper[4752]: E0929 10:55:53.813192 4752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-48xsw_openshift-operators_6fed8192-c969-4d5e-90f8-8dfcbdb533f6_0(ed8f97cc836fd1fe0aa63d1c6c577deef92ed3ca3417087b817856647f48a45b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" Sep 29 10:55:53 crc kubenswrapper[4752]: E0929 10:55:53.813251 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-7c8cf85677-48xsw_openshift-operators(6fed8192-c969-4d5e-90f8-8dfcbdb533f6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-7c8cf85677-48xsw_openshift-operators(6fed8192-c969-4d5e-90f8-8dfcbdb533f6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-48xsw_openshift-operators_6fed8192-c969-4d5e-90f8-8dfcbdb533f6_0(ed8f97cc836fd1fe0aa63d1c6c577deef92ed3ca3417087b817856647f48a45b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" podUID="6fed8192-c969-4d5e-90f8-8dfcbdb533f6" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.842985 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:55:53 crc kubenswrapper[4752]: E0929 10:55:53.872530 4752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-xlqjx_openshift-operators_48a83e0b-a019-4677-9d2c-4eaafc3a36b9_0(f35a605928ab6a569355686c3960f82cc579d4bccb069fd28a1f47aa16bdb62a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Sep 29 10:55:53 crc kubenswrapper[4752]: E0929 10:55:53.872998 4752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-xlqjx_openshift-operators_48a83e0b-a019-4677-9d2c-4eaafc3a36b9_0(f35a605928ab6a569355686c3960f82cc579d4bccb069fd28a1f47aa16bdb62a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:55:53 crc kubenswrapper[4752]: E0929 10:55:53.873022 4752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-xlqjx_openshift-operators_48a83e0b-a019-4677-9d2c-4eaafc3a36b9_0(f35a605928ab6a569355686c3960f82cc579d4bccb069fd28a1f47aa16bdb62a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:55:53 crc kubenswrapper[4752]: E0929 10:55:53.873071 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-cc5f78dfc-xlqjx_openshift-operators(48a83e0b-a019-4677-9d2c-4eaafc3a36b9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-cc5f78dfc-xlqjx_openshift-operators(48a83e0b-a019-4677-9d2c-4eaafc3a36b9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-xlqjx_openshift-operators_48a83e0b-a019-4677-9d2c-4eaafc3a36b9_0(f35a605928ab6a569355686c3960f82cc579d4bccb069fd28a1f47aa16bdb62a): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" podUID="48a83e0b-a019-4677-9d2c-4eaafc3a36b9" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.879587 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/094aaacc-0758-4722-bc88-f4d4fc529d36-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-gvb7z\" (UID: \"094aaacc-0758-4722-bc88-f4d4fc529d36\") " pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.879958 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dglg\" (UniqueName: \"kubernetes.io/projected/094aaacc-0758-4722-bc88-f4d4fc529d36-kube-api-access-4dglg\") pod \"perses-operator-54bc95c9fb-gvb7z\" (UID: \"094aaacc-0758-4722-bc88-f4d4fc529d36\") " pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.981497 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/094aaacc-0758-4722-bc88-f4d4fc529d36-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-gvb7z\" (UID: \"094aaacc-0758-4722-bc88-f4d4fc529d36\") " pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.981596 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dglg\" (UniqueName: \"kubernetes.io/projected/094aaacc-0758-4722-bc88-f4d4fc529d36-kube-api-access-4dglg\") pod \"perses-operator-54bc95c9fb-gvb7z\" (UID: \"094aaacc-0758-4722-bc88-f4d4fc529d36\") " pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:55:53 crc kubenswrapper[4752]: I0929 10:55:53.982620 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/094aaacc-0758-4722-bc88-f4d4fc529d36-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-gvb7z\" (UID: \"094aaacc-0758-4722-bc88-f4d4fc529d36\") " pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.002170 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dglg\" (UniqueName: \"kubernetes.io/projected/094aaacc-0758-4722-bc88-f4d4fc529d36-kube-api-access-4dglg\") pod \"perses-operator-54bc95c9fb-gvb7z\" (UID: \"094aaacc-0758-4722-bc88-f4d4fc529d36\") " pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.047133 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.069382 4752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-gvb7z_openshift-operators_094aaacc-0758-4722-bc88-f4d4fc529d36_0(d0dbc8fddaadf60c6bbaee3e8b6c25afb586dfe8cc9ac739b9a1be3c8cd7a7c5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.069475 4752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-gvb7z_openshift-operators_094aaacc-0758-4722-bc88-f4d4fc529d36_0(d0dbc8fddaadf60c6bbaee3e8b6c25afb586dfe8cc9ac739b9a1be3c8cd7a7c5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.069529 4752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-gvb7z_openshift-operators_094aaacc-0758-4722-bc88-f4d4fc529d36_0(d0dbc8fddaadf60c6bbaee3e8b6c25afb586dfe8cc9ac739b9a1be3c8cd7a7c5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.069592 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-54bc95c9fb-gvb7z_openshift-operators(094aaacc-0758-4722-bc88-f4d4fc529d36)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-54bc95c9fb-gvb7z_openshift-operators(094aaacc-0758-4722-bc88-f4d4fc529d36)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-gvb7z_openshift-operators_094aaacc-0758-4722-bc88-f4d4fc529d36_0(d0dbc8fddaadf60c6bbaee3e8b6c25afb586dfe8cc9ac739b9a1be3c8cd7a7c5): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" podUID="094aaacc-0758-4722-bc88-f4d4fc529d36" Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.609502 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk" event={"ID":"d26fb315-5752-4082-a006-d18643556c6f","Type":"ContainerStarted","Data":"81e846333c7fe7084b36e6a9692861b921f60a5ad80f211f60c84aa17538a955"} Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.609918 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk" Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.609979 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk" Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.609993 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk" Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.639401 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk" Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.640721 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk" Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.656681 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk" podStartSLOduration=7.656643077 podStartE2EDuration="7.656643077s" podCreationTimestamp="2025-09-29 10:55:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:55:54.645294321 +0000 UTC m=+695.434435988" watchObservedRunningTime="2025-09-29 10:55:54.656643077 +0000 UTC m=+695.445784895" Sep 29 10:55:54 crc kubenswrapper[4752]: 
I0929 10:55:54.713678 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-gvb7z"] Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.713885 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.714659 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.725901 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9"] Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.726073 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.726634 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.755166 4752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-gvb7z_openshift-operators_094aaacc-0758-4722-bc88-f4d4fc529d36_0(066d8e7157f002cba6a75172299321f631267cf663d7f6d1dcda49b889e8bc5e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.755631 4752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-gvb7z_openshift-operators_094aaacc-0758-4722-bc88-f4d4fc529d36_0(066d8e7157f002cba6a75172299321f631267cf663d7f6d1dcda49b889e8bc5e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.755660 4752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-gvb7z_openshift-operators_094aaacc-0758-4722-bc88-f4d4fc529d36_0(066d8e7157f002cba6a75172299321f631267cf663d7f6d1dcda49b889e8bc5e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.755767 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-54bc95c9fb-gvb7z_openshift-operators(094aaacc-0758-4722-bc88-f4d4fc529d36)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-54bc95c9fb-gvb7z_openshift-operators(094aaacc-0758-4722-bc88-f4d4fc529d36)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-gvb7z_openshift-operators_094aaacc-0758-4722-bc88-f4d4fc529d36_0(066d8e7157f002cba6a75172299321f631267cf663d7f6d1dcda49b889e8bc5e): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" podUID="094aaacc-0758-4722-bc88-f4d4fc529d36" Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.760562 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-xlqjx"] Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.760713 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.763350 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.783447 4752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-nzms9_openshift-operators_0dcbf226-4af2-490a-8974-2c107af2a51f_0(1e71a4b2f5a6ae7cae67fef93387a7074c3ce9a7a9a04a747e7db0fbfff40e44): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.783537 4752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-nzms9_openshift-operators_0dcbf226-4af2-490a-8974-2c107af2a51f_0(1e71a4b2f5a6ae7cae67fef93387a7074c3ce9a7a9a04a747e7db0fbfff40e44): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.783576 4752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-nzms9_openshift-operators_0dcbf226-4af2-490a-8974-2c107af2a51f_0(1e71a4b2f5a6ae7cae67fef93387a7074c3ce9a7a9a04a747e7db0fbfff40e44): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.783658 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-66f87598ff-nzms9_openshift-operators(0dcbf226-4af2-490a-8974-2c107af2a51f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-66f87598ff-nzms9_openshift-operators(0dcbf226-4af2-490a-8974-2c107af2a51f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-nzms9_openshift-operators_0dcbf226-4af2-490a-8974-2c107af2a51f_0(1e71a4b2f5a6ae7cae67fef93387a7074c3ce9a7a9a04a747e7db0fbfff40e44): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" podUID="0dcbf226-4af2-490a-8974-2c107af2a51f" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.791847 4752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-xlqjx_openshift-operators_48a83e0b-a019-4677-9d2c-4eaafc3a36b9_0(a39578ba73d1b7f49773aa8e778066ba4d9b8478944086dd639165228f37ebe5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.791946 4752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-xlqjx_openshift-operators_48a83e0b-a019-4677-9d2c-4eaafc3a36b9_0(a39578ba73d1b7f49773aa8e778066ba4d9b8478944086dd639165228f37ebe5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.791975 4752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-xlqjx_openshift-operators_48a83e0b-a019-4677-9d2c-4eaafc3a36b9_0(a39578ba73d1b7f49773aa8e778066ba4d9b8478944086dd639165228f37ebe5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.792035 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-cc5f78dfc-xlqjx_openshift-operators(48a83e0b-a019-4677-9d2c-4eaafc3a36b9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-cc5f78dfc-xlqjx_openshift-operators(48a83e0b-a019-4677-9d2c-4eaafc3a36b9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-xlqjx_openshift-operators_48a83e0b-a019-4677-9d2c-4eaafc3a36b9_0(a39578ba73d1b7f49773aa8e778066ba4d9b8478944086dd639165228f37ebe5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" podUID="48a83e0b-a019-4677-9d2c-4eaafc3a36b9" Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.812352 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7"] Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.812594 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.813284 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.817668 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw"] Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.817895 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" Sep 29 10:55:54 crc kubenswrapper[4752]: I0929 10:55:54.818482 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.877351 4752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7_openshift-operators_81a0fa10-1cf0-4d64-8d87-be74cb9f191c_0(f545f2571a2de19134045af6677e451faad0f541fc8bff2f492dc103ae0be6f1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.877441 4752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7_openshift-operators_81a0fa10-1cf0-4d64-8d87-be74cb9f191c_0(f545f2571a2de19134045af6677e451faad0f541fc8bff2f492dc103ae0be6f1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.877471 4752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7_openshift-operators_81a0fa10-1cf0-4d64-8d87-be74cb9f191c_0(f545f2571a2de19134045af6677e451faad0f541fc8bff2f492dc103ae0be6f1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.877530 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7_openshift-operators(81a0fa10-1cf0-4d64-8d87-be74cb9f191c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7_openshift-operators(81a0fa10-1cf0-4d64-8d87-be74cb9f191c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7_openshift-operators_81a0fa10-1cf0-4d64-8d87-be74cb9f191c_0(f545f2571a2de19134045af6677e451faad0f541fc8bff2f492dc103ae0be6f1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" podUID="81a0fa10-1cf0-4d64-8d87-be74cb9f191c" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.890689 4752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-48xsw_openshift-operators_6fed8192-c969-4d5e-90f8-8dfcbdb533f6_0(99cfa266a5538187505128c4e4781571df6c35edb3d344b74812a4add55fb76a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.890770 4752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-48xsw_openshift-operators_6fed8192-c969-4d5e-90f8-8dfcbdb533f6_0(99cfa266a5538187505128c4e4781571df6c35edb3d344b74812a4add55fb76a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.890793 4752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-48xsw_openshift-operators_6fed8192-c969-4d5e-90f8-8dfcbdb533f6_0(99cfa266a5538187505128c4e4781571df6c35edb3d344b74812a4add55fb76a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" Sep 29 10:55:54 crc kubenswrapper[4752]: E0929 10:55:54.890858 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-7c8cf85677-48xsw_openshift-operators(6fed8192-c969-4d5e-90f8-8dfcbdb533f6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-7c8cf85677-48xsw_openshift-operators(6fed8192-c969-4d5e-90f8-8dfcbdb533f6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-48xsw_openshift-operators_6fed8192-c969-4d5e-90f8-8dfcbdb533f6_0(99cfa266a5538187505128c4e4781571df6c35edb3d344b74812a4add55fb76a): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" podUID="6fed8192-c969-4d5e-90f8-8dfcbdb533f6" Sep 29 10:55:56 crc kubenswrapper[4752]: I0929 10:55:56.175664 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:55:56 crc kubenswrapper[4752]: I0929 10:55:56.175728 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:55:59 crc kubenswrapper[4752]: I0929 10:55:59.031281 4752 scope.go:117] "RemoveContainer" containerID="eff8591a1e7e061df63a2f3b4b4af9f4dd03197426fd89027902ac085abf289f" Sep 29 10:55:59 crc kubenswrapper[4752]: E0929 10:55:59.032038 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-xv5q7_openshift-multus(52fc9378-c37b-424b-afde-7b191bab5fde)\"" pod="openshift-multus/multus-xv5q7" podUID="52fc9378-c37b-424b-afde-7b191bab5fde" Sep 29 10:56:05 crc kubenswrapper[4752]: I0929 10:56:05.030145 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:56:05 crc kubenswrapper[4752]: I0929 10:56:05.030189 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" Sep 29 10:56:05 crc kubenswrapper[4752]: I0929 10:56:05.031312 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:56:05 crc kubenswrapper[4752]: I0929 10:56:05.031443 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" Sep 29 10:56:05 crc kubenswrapper[4752]: E0929 10:56:05.076701 4752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-nzms9_openshift-operators_0dcbf226-4af2-490a-8974-2c107af2a51f_0(887b856cbfa45daace5d2b410457b026b023370119316cc5d8c31ad50d652187): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 10:56:05 crc kubenswrapper[4752]: E0929 10:56:05.077392 4752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-nzms9_openshift-operators_0dcbf226-4af2-490a-8974-2c107af2a51f_0(887b856cbfa45daace5d2b410457b026b023370119316cc5d8c31ad50d652187): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" Sep 29 10:56:05 crc kubenswrapper[4752]: E0929 10:56:05.077427 4752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-nzms9_openshift-operators_0dcbf226-4af2-490a-8974-2c107af2a51f_0(887b856cbfa45daace5d2b410457b026b023370119316cc5d8c31ad50d652187): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" Sep 29 10:56:05 crc kubenswrapper[4752]: E0929 10:56:05.077516 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-66f87598ff-nzms9_openshift-operators(0dcbf226-4af2-490a-8974-2c107af2a51f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-66f87598ff-nzms9_openshift-operators(0dcbf226-4af2-490a-8974-2c107af2a51f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-nzms9_openshift-operators_0dcbf226-4af2-490a-8974-2c107af2a51f_0(887b856cbfa45daace5d2b410457b026b023370119316cc5d8c31ad50d652187): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" podUID="0dcbf226-4af2-490a-8974-2c107af2a51f" Sep 29 10:56:05 crc kubenswrapper[4752]: E0929 10:56:05.084842 4752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-xlqjx_openshift-operators_48a83e0b-a019-4677-9d2c-4eaafc3a36b9_0(0ac5aa5642e0d6136d4c531ca95454377f32cb891866e39dfb75ad813c379755): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 10:56:05 crc kubenswrapper[4752]: E0929 10:56:05.084940 4752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-xlqjx_openshift-operators_48a83e0b-a019-4677-9d2c-4eaafc3a36b9_0(0ac5aa5642e0d6136d4c531ca95454377f32cb891866e39dfb75ad813c379755): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:56:05 crc kubenswrapper[4752]: E0929 10:56:05.084970 4752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-xlqjx_openshift-operators_48a83e0b-a019-4677-9d2c-4eaafc3a36b9_0(0ac5aa5642e0d6136d4c531ca95454377f32cb891866e39dfb75ad813c379755): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:56:05 crc kubenswrapper[4752]: E0929 10:56:05.085038 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-cc5f78dfc-xlqjx_openshift-operators(48a83e0b-a019-4677-9d2c-4eaafc3a36b9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-cc5f78dfc-xlqjx_openshift-operators(48a83e0b-a019-4677-9d2c-4eaafc3a36b9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-xlqjx_openshift-operators_48a83e0b-a019-4677-9d2c-4eaafc3a36b9_0(0ac5aa5642e0d6136d4c531ca95454377f32cb891866e39dfb75ad813c379755): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" podUID="48a83e0b-a019-4677-9d2c-4eaafc3a36b9" Sep 29 10:56:07 crc kubenswrapper[4752]: I0929 10:56:07.031033 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" Sep 29 10:56:07 crc kubenswrapper[4752]: I0929 10:56:07.032297 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" Sep 29 10:56:07 crc kubenswrapper[4752]: E0929 10:56:07.062499 4752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-48xsw_openshift-operators_6fed8192-c969-4d5e-90f8-8dfcbdb533f6_0(2df3c31764eaa6905d14a752fa6daee1c1278f4ce93b72b97125531106fb0a01): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Sep 29 10:56:07 crc kubenswrapper[4752]: E0929 10:56:07.062581 4752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-48xsw_openshift-operators_6fed8192-c969-4d5e-90f8-8dfcbdb533f6_0(2df3c31764eaa6905d14a752fa6daee1c1278f4ce93b72b97125531106fb0a01): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" Sep 29 10:56:07 crc kubenswrapper[4752]: E0929 10:56:07.062606 4752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-48xsw_openshift-operators_6fed8192-c969-4d5e-90f8-8dfcbdb533f6_0(2df3c31764eaa6905d14a752fa6daee1c1278f4ce93b72b97125531106fb0a01): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" Sep 29 10:56:07 crc kubenswrapper[4752]: E0929 10:56:07.062696 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-7c8cf85677-48xsw_openshift-operators(6fed8192-c969-4d5e-90f8-8dfcbdb533f6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-7c8cf85677-48xsw_openshift-operators(6fed8192-c969-4d5e-90f8-8dfcbdb533f6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-48xsw_openshift-operators_6fed8192-c969-4d5e-90f8-8dfcbdb533f6_0(2df3c31764eaa6905d14a752fa6daee1c1278f4ce93b72b97125531106fb0a01): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" podUID="6fed8192-c969-4d5e-90f8-8dfcbdb533f6" Sep 29 10:56:10 crc kubenswrapper[4752]: I0929 10:56:10.030276 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:56:10 crc kubenswrapper[4752]: I0929 10:56:10.030423 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" Sep 29 10:56:10 crc kubenswrapper[4752]: I0929 10:56:10.033500 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" Sep 29 10:56:10 crc kubenswrapper[4752]: I0929 10:56:10.033500 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:56:10 crc kubenswrapper[4752]: E0929 10:56:10.064947 4752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-gvb7z_openshift-operators_094aaacc-0758-4722-bc88-f4d4fc529d36_0(0bc9197f5a8749d2c9f580da08b0a9f8267e1d74ccfe215488c2872c3970b712): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 10:56:10 crc kubenswrapper[4752]: E0929 10:56:10.065266 4752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-gvb7z_openshift-operators_094aaacc-0758-4722-bc88-f4d4fc529d36_0(0bc9197f5a8749d2c9f580da08b0a9f8267e1d74ccfe215488c2872c3970b712): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:56:10 crc kubenswrapper[4752]: E0929 10:56:10.065292 4752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-gvb7z_openshift-operators_094aaacc-0758-4722-bc88-f4d4fc529d36_0(0bc9197f5a8749d2c9f580da08b0a9f8267e1d74ccfe215488c2872c3970b712): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:56:10 crc kubenswrapper[4752]: E0929 10:56:10.065349 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-54bc95c9fb-gvb7z_openshift-operators(094aaacc-0758-4722-bc88-f4d4fc529d36)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-54bc95c9fb-gvb7z_openshift-operators(094aaacc-0758-4722-bc88-f4d4fc529d36)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-gvb7z_openshift-operators_094aaacc-0758-4722-bc88-f4d4fc529d36_0(0bc9197f5a8749d2c9f580da08b0a9f8267e1d74ccfe215488c2872c3970b712): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" podUID="094aaacc-0758-4722-bc88-f4d4fc529d36" Sep 29 10:56:10 crc kubenswrapper[4752]: E0929 10:56:10.071376 4752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7_openshift-operators_81a0fa10-1cf0-4d64-8d87-be74cb9f191c_0(4c6a45f61043df5ee5d58cc322bbbb65f6ad177ffc1b1c2ae05d15d48f380a0c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Sep 29 10:56:10 crc kubenswrapper[4752]: E0929 10:56:10.071424 4752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7_openshift-operators_81a0fa10-1cf0-4d64-8d87-be74cb9f191c_0(4c6a45f61043df5ee5d58cc322bbbb65f6ad177ffc1b1c2ae05d15d48f380a0c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" Sep 29 10:56:10 crc kubenswrapper[4752]: E0929 10:56:10.071444 4752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7_openshift-operators_81a0fa10-1cf0-4d64-8d87-be74cb9f191c_0(4c6a45f61043df5ee5d58cc322bbbb65f6ad177ffc1b1c2ae05d15d48f380a0c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" Sep 29 10:56:10 crc kubenswrapper[4752]: E0929 10:56:10.071483 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7_openshift-operators(81a0fa10-1cf0-4d64-8d87-be74cb9f191c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7_openshift-operators(81a0fa10-1cf0-4d64-8d87-be74cb9f191c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7_openshift-operators_81a0fa10-1cf0-4d64-8d87-be74cb9f191c_0(4c6a45f61043df5ee5d58cc322bbbb65f6ad177ffc1b1c2ae05d15d48f380a0c): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" podUID="81a0fa10-1cf0-4d64-8d87-be74cb9f191c" Sep 29 10:56:12 crc kubenswrapper[4752]: I0929 10:56:12.031562 4752 scope.go:117] "RemoveContainer" containerID="eff8591a1e7e061df63a2f3b4b4af9f4dd03197426fd89027902ac085abf289f" Sep 29 10:56:12 crc kubenswrapper[4752]: I0929 10:56:12.713659 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xv5q7_52fc9378-c37b-424b-afde-7b191bab5fde/kube-multus/2.log" Sep 29 10:56:12 crc kubenswrapper[4752]: I0929 10:56:12.714177 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xv5q7" event={"ID":"52fc9378-c37b-424b-afde-7b191bab5fde","Type":"ContainerStarted","Data":"18ae8b0fd947a4a72f0138cd13c868b6a6a6561ec0bd36132af5d37055efb4ce"} Sep 29 10:56:17 crc kubenswrapper[4752]: I0929 10:56:17.030648 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:56:17 crc kubenswrapper[4752]: I0929 10:56:17.030703 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" Sep 29 10:56:17 crc kubenswrapper[4752]: I0929 10:56:17.031760 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:56:17 crc kubenswrapper[4752]: I0929 10:56:17.031851 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" Sep 29 10:56:17 crc kubenswrapper[4752]: I0929 10:56:17.266382 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-xlqjx"] Sep 29 10:56:17 crc kubenswrapper[4752]: W0929 10:56:17.273197 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48a83e0b_a019_4677_9d2c_4eaafc3a36b9.slice/crio-e4c14d6eca6369bfae47d79642ece5b94b33351e12fb4e5fddc9822cf2c49025 WatchSource:0}: Error finding container e4c14d6eca6369bfae47d79642ece5b94b33351e12fb4e5fddc9822cf2c49025: Status 404 returned error can't find the container with id e4c14d6eca6369bfae47d79642ece5b94b33351e12fb4e5fddc9822cf2c49025 Sep 29 10:56:17 crc kubenswrapper[4752]: I0929 10:56:17.303051 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9"] Sep 29 10:56:17 crc kubenswrapper[4752]: I0929 10:56:17.542340 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jxmdk" Sep 29 10:56:17 crc kubenswrapper[4752]: I0929 10:56:17.749547 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" event={"ID":"48a83e0b-a019-4677-9d2c-4eaafc3a36b9","Type":"ContainerStarted","Data":"e4c14d6eca6369bfae47d79642ece5b94b33351e12fb4e5fddc9822cf2c49025"} Sep 29 10:56:17 crc kubenswrapper[4752]: I0929 10:56:17.752653 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" event={"ID":"0dcbf226-4af2-490a-8974-2c107af2a51f","Type":"ContainerStarted","Data":"46429c9aea0c9aeec1587c4cc3dae9c0e5d8ab2367055e6acb15dfbf48675351"} Sep 29 10:56:21 crc kubenswrapper[4752]: I0929 10:56:21.035156 4752 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" Sep 29 10:56:21 crc kubenswrapper[4752]: I0929 10:56:21.036301 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" Sep 29 10:56:21 crc kubenswrapper[4752]: I0929 10:56:21.036701 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" Sep 29 10:56:21 crc kubenswrapper[4752]: I0929 10:56:21.036916 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" Sep 29 10:56:21 crc kubenswrapper[4752]: I0929 10:56:21.269255 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7"] Sep 29 10:56:21 crc kubenswrapper[4752]: I0929 10:56:21.330827 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw"] Sep 29 10:56:21 crc kubenswrapper[4752]: W0929 10:56:21.344464 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fed8192_c969_4d5e_90f8_8dfcbdb533f6.slice/crio-4220dec728268484de06aba165b44038e01c4b595d652e700ee9c4a9ce791c2e WatchSource:0}: Error finding container 4220dec728268484de06aba165b44038e01c4b595d652e700ee9c4a9ce791c2e: Status 404 returned error can't find the container with id 4220dec728268484de06aba165b44038e01c4b595d652e700ee9c4a9ce791c2e Sep 29 10:56:21 crc kubenswrapper[4752]: I0929 10:56:21.787295 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" 
event={"ID":"81a0fa10-1cf0-4d64-8d87-be74cb9f191c","Type":"ContainerStarted","Data":"11e39037fc84d1b4622bde2485671f940d41829d7e6e6253d1c0354a0a57bae4"} Sep 29 10:56:21 crc kubenswrapper[4752]: I0929 10:56:21.790588 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" event={"ID":"6fed8192-c969-4d5e-90f8-8dfcbdb533f6","Type":"ContainerStarted","Data":"4220dec728268484de06aba165b44038e01c4b595d652e700ee9c4a9ce791c2e"} Sep 29 10:56:24 crc kubenswrapper[4752]: I0929 10:56:24.030946 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:56:24 crc kubenswrapper[4752]: I0929 10:56:24.031686 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:56:26 crc kubenswrapper[4752]: I0929 10:56:26.175881 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:56:26 crc kubenswrapper[4752]: I0929 10:56:26.175954 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:56:27 crc kubenswrapper[4752]: I0929 10:56:27.080211 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-gvb7z"] Sep 29 10:56:27 crc kubenswrapper[4752]: W0929 10:56:27.102056 4752 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod094aaacc_0758_4722_bc88_f4d4fc529d36.slice/crio-51de518e659a2ab6612b105b7d12f4649feec2b0a92d2440386e4a7762198428 WatchSource:0}: Error finding container 51de518e659a2ab6612b105b7d12f4649feec2b0a92d2440386e4a7762198428: Status 404 returned error can't find the container with id 51de518e659a2ab6612b105b7d12f4649feec2b0a92d2440386e4a7762198428 Sep 29 10:56:27 crc kubenswrapper[4752]: I0929 10:56:27.831539 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" event={"ID":"6fed8192-c969-4d5e-90f8-8dfcbdb533f6","Type":"ContainerStarted","Data":"3c02dc6929cfc8309cd2dd5b0546cafa335126393bb3d626a1ad2113eaf34cec"} Sep 29 10:56:27 crc kubenswrapper[4752]: I0929 10:56:27.835190 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" event={"ID":"81a0fa10-1cf0-4d64-8d87-be74cb9f191c","Type":"ContainerStarted","Data":"cf0dd06600ac9a10e04455199ca39a42a30c2fa98135ed609a60820344eb3c92"} Sep 29 10:56:27 crc kubenswrapper[4752]: I0929 10:56:27.837203 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" event={"ID":"094aaacc-0758-4722-bc88-f4d4fc529d36","Type":"ContainerStarted","Data":"51de518e659a2ab6612b105b7d12f4649feec2b0a92d2440386e4a7762198428"} Sep 29 10:56:27 crc kubenswrapper[4752]: I0929 10:56:27.838931 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" event={"ID":"48a83e0b-a019-4677-9d2c-4eaafc3a36b9","Type":"ContainerStarted","Data":"d01e35734bfd76fffcaee90b577d49ed7410e5d8add8c8d09ddc64dff4befc6c"} Sep 29 10:56:27 crc kubenswrapper[4752]: I0929 10:56:27.839185 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:56:27 crc 
kubenswrapper[4752]: I0929 10:56:27.841306 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" event={"ID":"0dcbf226-4af2-490a-8974-2c107af2a51f","Type":"ContainerStarted","Data":"977905656dbd97e941461fc4256dc6b0dea6bcdbd5ee2e548ed87021b58abc35"} Sep 29 10:56:27 crc kubenswrapper[4752]: I0929 10:56:27.845513 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" Sep 29 10:56:27 crc kubenswrapper[4752]: I0929 10:56:27.850406 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-48xsw" podStartSLOduration=29.201559759 podStartE2EDuration="34.850382345s" podCreationTimestamp="2025-09-29 10:55:53 +0000 UTC" firstStartedPulling="2025-09-29 10:56:21.348650281 +0000 UTC m=+722.137791948" lastFinishedPulling="2025-09-29 10:56:26.997472867 +0000 UTC m=+727.786614534" observedRunningTime="2025-09-29 10:56:27.848618768 +0000 UTC m=+728.637760435" watchObservedRunningTime="2025-09-29 10:56:27.850382345 +0000 UTC m=+728.639524012" Sep 29 10:56:27 crc kubenswrapper[4752]: I0929 10:56:27.872059 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7" podStartSLOduration=29.197411261 podStartE2EDuration="34.87203691s" podCreationTimestamp="2025-09-29 10:55:53 +0000 UTC" firstStartedPulling="2025-09-29 10:56:21.283136072 +0000 UTC m=+722.072277739" lastFinishedPulling="2025-09-29 10:56:26.957761721 +0000 UTC m=+727.746903388" observedRunningTime="2025-09-29 10:56:27.866836314 +0000 UTC m=+728.655977991" watchObservedRunningTime="2025-09-29 10:56:27.87203691 +0000 UTC m=+728.661178577" Sep 29 10:56:27 crc kubenswrapper[4752]: I0929 10:56:27.914918 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/observability-operator-cc5f78dfc-xlqjx" podStartSLOduration=25.231611306 podStartE2EDuration="34.914894368s" podCreationTimestamp="2025-09-29 10:55:53 +0000 UTC" firstStartedPulling="2025-09-29 10:56:17.277239271 +0000 UTC m=+718.066380938" lastFinishedPulling="2025-09-29 10:56:26.960522333 +0000 UTC m=+727.749664000" observedRunningTime="2025-09-29 10:56:27.909828055 +0000 UTC m=+728.698969742" watchObservedRunningTime="2025-09-29 10:56:27.914894368 +0000 UTC m=+728.704036045" Sep 29 10:56:27 crc kubenswrapper[4752]: I0929 10:56:27.935254 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66f87598ff-nzms9" podStartSLOduration=25.328356209 podStartE2EDuration="34.935228418s" podCreationTimestamp="2025-09-29 10:55:53 +0000 UTC" firstStartedPulling="2025-09-29 10:56:17.312242154 +0000 UTC m=+718.101383821" lastFinishedPulling="2025-09-29 10:56:26.919114363 +0000 UTC m=+727.708256030" observedRunningTime="2025-09-29 10:56:27.932401884 +0000 UTC m=+728.721543551" watchObservedRunningTime="2025-09-29 10:56:27.935228418 +0000 UTC m=+728.724370085" Sep 29 10:56:30 crc kubenswrapper[4752]: I0929 10:56:30.861653 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" event={"ID":"094aaacc-0758-4722-bc88-f4d4fc529d36","Type":"ContainerStarted","Data":"0f739fd6bd440b2ba0eb6682d353b6e9dc43c62325d7f33757f1155d3613b108"} Sep 29 10:56:30 crc kubenswrapper[4752]: I0929 10:56:30.862413 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:56:30 crc kubenswrapper[4752]: I0929 10:56:30.882374 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" podStartSLOduration=34.955506698 podStartE2EDuration="37.882349622s" podCreationTimestamp="2025-09-29 10:55:53 
+0000 UTC" firstStartedPulling="2025-09-29 10:56:27.105946597 +0000 UTC m=+727.895088264" lastFinishedPulling="2025-09-29 10:56:30.032789521 +0000 UTC m=+730.821931188" observedRunningTime="2025-09-29 10:56:30.881334665 +0000 UTC m=+731.670476332" watchObservedRunningTime="2025-09-29 10:56:30.882349622 +0000 UTC m=+731.671491309" Sep 29 10:56:34 crc kubenswrapper[4752]: I0929 10:56:34.440082 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42"] Sep 29 10:56:34 crc kubenswrapper[4752]: I0929 10:56:34.441697 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42" Sep 29 10:56:34 crc kubenswrapper[4752]: I0929 10:56:34.443270 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 29 10:56:34 crc kubenswrapper[4752]: I0929 10:56:34.451911 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42"] Sep 29 10:56:34 crc kubenswrapper[4752]: I0929 10:56:34.559901 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/146d6e75-b6e0-4be4-a82e-cafe303eb1c1-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42\" (UID: \"146d6e75-b6e0-4be4-a82e-cafe303eb1c1\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42" Sep 29 10:56:34 crc kubenswrapper[4752]: I0929 10:56:34.560056 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdpjb\" (UniqueName: \"kubernetes.io/projected/146d6e75-b6e0-4be4-a82e-cafe303eb1c1-kube-api-access-zdpjb\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42\" (UID: 
\"146d6e75-b6e0-4be4-a82e-cafe303eb1c1\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42" Sep 29 10:56:34 crc kubenswrapper[4752]: I0929 10:56:34.560088 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/146d6e75-b6e0-4be4-a82e-cafe303eb1c1-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42\" (UID: \"146d6e75-b6e0-4be4-a82e-cafe303eb1c1\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42" Sep 29 10:56:34 crc kubenswrapper[4752]: I0929 10:56:34.661842 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdpjb\" (UniqueName: \"kubernetes.io/projected/146d6e75-b6e0-4be4-a82e-cafe303eb1c1-kube-api-access-zdpjb\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42\" (UID: \"146d6e75-b6e0-4be4-a82e-cafe303eb1c1\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42" Sep 29 10:56:34 crc kubenswrapper[4752]: I0929 10:56:34.661939 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/146d6e75-b6e0-4be4-a82e-cafe303eb1c1-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42\" (UID: \"146d6e75-b6e0-4be4-a82e-cafe303eb1c1\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42" Sep 29 10:56:34 crc kubenswrapper[4752]: I0929 10:56:34.662009 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/146d6e75-b6e0-4be4-a82e-cafe303eb1c1-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42\" (UID: \"146d6e75-b6e0-4be4-a82e-cafe303eb1c1\") " 
pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42" Sep 29 10:56:34 crc kubenswrapper[4752]: I0929 10:56:34.662714 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/146d6e75-b6e0-4be4-a82e-cafe303eb1c1-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42\" (UID: \"146d6e75-b6e0-4be4-a82e-cafe303eb1c1\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42" Sep 29 10:56:34 crc kubenswrapper[4752]: I0929 10:56:34.662879 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/146d6e75-b6e0-4be4-a82e-cafe303eb1c1-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42\" (UID: \"146d6e75-b6e0-4be4-a82e-cafe303eb1c1\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42" Sep 29 10:56:34 crc kubenswrapper[4752]: I0929 10:56:34.686443 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdpjb\" (UniqueName: \"kubernetes.io/projected/146d6e75-b6e0-4be4-a82e-cafe303eb1c1-kube-api-access-zdpjb\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42\" (UID: \"146d6e75-b6e0-4be4-a82e-cafe303eb1c1\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42" Sep 29 10:56:34 crc kubenswrapper[4752]: I0929 10:56:34.759406 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42" Sep 29 10:56:35 crc kubenswrapper[4752]: I0929 10:56:35.053985 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42"] Sep 29 10:56:35 crc kubenswrapper[4752]: W0929 10:56:35.060126 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod146d6e75_b6e0_4be4_a82e_cafe303eb1c1.slice/crio-3c87a34e809b8cbf723e23c701db826099b1d81d59de58556d305943e1a8c686 WatchSource:0}: Error finding container 3c87a34e809b8cbf723e23c701db826099b1d81d59de58556d305943e1a8c686: Status 404 returned error can't find the container with id 3c87a34e809b8cbf723e23c701db826099b1d81d59de58556d305943e1a8c686 Sep 29 10:56:35 crc kubenswrapper[4752]: I0929 10:56:35.893082 4752 generic.go:334] "Generic (PLEG): container finished" podID="146d6e75-b6e0-4be4-a82e-cafe303eb1c1" containerID="28142913443ab726bdec504b5dc69af83b86b3211fa02885724a2c71cec73ee1" exitCode=0 Sep 29 10:56:35 crc kubenswrapper[4752]: I0929 10:56:35.893144 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42" event={"ID":"146d6e75-b6e0-4be4-a82e-cafe303eb1c1","Type":"ContainerDied","Data":"28142913443ab726bdec504b5dc69af83b86b3211fa02885724a2c71cec73ee1"} Sep 29 10:56:35 crc kubenswrapper[4752]: I0929 10:56:35.893187 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42" event={"ID":"146d6e75-b6e0-4be4-a82e-cafe303eb1c1","Type":"ContainerStarted","Data":"3c87a34e809b8cbf723e23c701db826099b1d81d59de58556d305943e1a8c686"} Sep 29 10:56:37 crc kubenswrapper[4752]: I0929 10:56:37.910195 4752 generic.go:334] "Generic (PLEG): container finished" 
podID="146d6e75-b6e0-4be4-a82e-cafe303eb1c1" containerID="965d0016b1242aa3ad057eb797606ed6cba72e5a81300e63eddd0780990c37c3" exitCode=0 Sep 29 10:56:37 crc kubenswrapper[4752]: I0929 10:56:37.910299 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42" event={"ID":"146d6e75-b6e0-4be4-a82e-cafe303eb1c1","Type":"ContainerDied","Data":"965d0016b1242aa3ad057eb797606ed6cba72e5a81300e63eddd0780990c37c3"} Sep 29 10:56:38 crc kubenswrapper[4752]: I0929 10:56:38.919572 4752 generic.go:334] "Generic (PLEG): container finished" podID="146d6e75-b6e0-4be4-a82e-cafe303eb1c1" containerID="50258a8e7d3e412acb227c52154803a2fe298ad21d0473c903761b21cd36bf90" exitCode=0 Sep 29 10:56:38 crc kubenswrapper[4752]: I0929 10:56:38.919714 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42" event={"ID":"146d6e75-b6e0-4be4-a82e-cafe303eb1c1","Type":"ContainerDied","Data":"50258a8e7d3e412acb227c52154803a2fe298ad21d0473c903761b21cd36bf90"} Sep 29 10:56:40 crc kubenswrapper[4752]: I0929 10:56:40.156772 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42" Sep 29 10:56:40 crc kubenswrapper[4752]: I0929 10:56:40.261422 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/146d6e75-b6e0-4be4-a82e-cafe303eb1c1-util\") pod \"146d6e75-b6e0-4be4-a82e-cafe303eb1c1\" (UID: \"146d6e75-b6e0-4be4-a82e-cafe303eb1c1\") " Sep 29 10:56:40 crc kubenswrapper[4752]: I0929 10:56:40.261500 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdpjb\" (UniqueName: \"kubernetes.io/projected/146d6e75-b6e0-4be4-a82e-cafe303eb1c1-kube-api-access-zdpjb\") pod \"146d6e75-b6e0-4be4-a82e-cafe303eb1c1\" (UID: \"146d6e75-b6e0-4be4-a82e-cafe303eb1c1\") " Sep 29 10:56:40 crc kubenswrapper[4752]: I0929 10:56:40.261526 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/146d6e75-b6e0-4be4-a82e-cafe303eb1c1-bundle\") pod \"146d6e75-b6e0-4be4-a82e-cafe303eb1c1\" (UID: \"146d6e75-b6e0-4be4-a82e-cafe303eb1c1\") " Sep 29 10:56:40 crc kubenswrapper[4752]: I0929 10:56:40.262213 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/146d6e75-b6e0-4be4-a82e-cafe303eb1c1-bundle" (OuterVolumeSpecName: "bundle") pod "146d6e75-b6e0-4be4-a82e-cafe303eb1c1" (UID: "146d6e75-b6e0-4be4-a82e-cafe303eb1c1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:56:40 crc kubenswrapper[4752]: I0929 10:56:40.268686 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146d6e75-b6e0-4be4-a82e-cafe303eb1c1-kube-api-access-zdpjb" (OuterVolumeSpecName: "kube-api-access-zdpjb") pod "146d6e75-b6e0-4be4-a82e-cafe303eb1c1" (UID: "146d6e75-b6e0-4be4-a82e-cafe303eb1c1"). InnerVolumeSpecName "kube-api-access-zdpjb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:56:40 crc kubenswrapper[4752]: I0929 10:56:40.282212 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/146d6e75-b6e0-4be4-a82e-cafe303eb1c1-util" (OuterVolumeSpecName: "util") pod "146d6e75-b6e0-4be4-a82e-cafe303eb1c1" (UID: "146d6e75-b6e0-4be4-a82e-cafe303eb1c1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:56:40 crc kubenswrapper[4752]: I0929 10:56:40.363459 4752 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/146d6e75-b6e0-4be4-a82e-cafe303eb1c1-util\") on node \"crc\" DevicePath \"\"" Sep 29 10:56:40 crc kubenswrapper[4752]: I0929 10:56:40.363539 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdpjb\" (UniqueName: \"kubernetes.io/projected/146d6e75-b6e0-4be4-a82e-cafe303eb1c1-kube-api-access-zdpjb\") on node \"crc\" DevicePath \"\"" Sep 29 10:56:40 crc kubenswrapper[4752]: I0929 10:56:40.363554 4752 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/146d6e75-b6e0-4be4-a82e-cafe303eb1c1-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:56:40 crc kubenswrapper[4752]: I0929 10:56:40.935293 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42" event={"ID":"146d6e75-b6e0-4be4-a82e-cafe303eb1c1","Type":"ContainerDied","Data":"3c87a34e809b8cbf723e23c701db826099b1d81d59de58556d305943e1a8c686"} Sep 29 10:56:40 crc kubenswrapper[4752]: I0929 10:56:40.935335 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c87a34e809b8cbf723e23c701db826099b1d81d59de58556d305943e1a8c686" Sep 29 10:56:40 crc kubenswrapper[4752]: I0929 10:56:40.935470 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42" Sep 29 10:56:44 crc kubenswrapper[4752]: I0929 10:56:44.050863 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-gvb7z" Sep 29 10:56:46 crc kubenswrapper[4752]: I0929 10:56:46.504067 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-s6ttt"] Sep 29 10:56:46 crc kubenswrapper[4752]: E0929 10:56:46.504851 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146d6e75-b6e0-4be4-a82e-cafe303eb1c1" containerName="pull" Sep 29 10:56:46 crc kubenswrapper[4752]: I0929 10:56:46.504869 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="146d6e75-b6e0-4be4-a82e-cafe303eb1c1" containerName="pull" Sep 29 10:56:46 crc kubenswrapper[4752]: E0929 10:56:46.504878 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146d6e75-b6e0-4be4-a82e-cafe303eb1c1" containerName="util" Sep 29 10:56:46 crc kubenswrapper[4752]: I0929 10:56:46.504884 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="146d6e75-b6e0-4be4-a82e-cafe303eb1c1" containerName="util" Sep 29 10:56:46 crc kubenswrapper[4752]: E0929 10:56:46.504900 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146d6e75-b6e0-4be4-a82e-cafe303eb1c1" containerName="extract" Sep 29 10:56:46 crc kubenswrapper[4752]: I0929 10:56:46.504909 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="146d6e75-b6e0-4be4-a82e-cafe303eb1c1" containerName="extract" Sep 29 10:56:46 crc kubenswrapper[4752]: I0929 10:56:46.505031 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="146d6e75-b6e0-4be4-a82e-cafe303eb1c1" containerName="extract" Sep 29 10:56:46 crc kubenswrapper[4752]: I0929 10:56:46.505557 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-s6ttt" Sep 29 10:56:46 crc kubenswrapper[4752]: I0929 10:56:46.507696 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Sep 29 10:56:46 crc kubenswrapper[4752]: I0929 10:56:46.507775 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-p6nlt" Sep 29 10:56:46 crc kubenswrapper[4752]: I0929 10:56:46.507832 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Sep 29 10:56:46 crc kubenswrapper[4752]: I0929 10:56:46.519394 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-s6ttt"] Sep 29 10:56:46 crc kubenswrapper[4752]: I0929 10:56:46.551589 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrpvr\" (UniqueName: \"kubernetes.io/projected/c7616dcc-135c-41d4-bf4c-8f72270fa5fd-kube-api-access-rrpvr\") pod \"nmstate-operator-5d6f6cfd66-s6ttt\" (UID: \"c7616dcc-135c-41d4-bf4c-8f72270fa5fd\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-s6ttt" Sep 29 10:56:46 crc kubenswrapper[4752]: I0929 10:56:46.652629 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrpvr\" (UniqueName: \"kubernetes.io/projected/c7616dcc-135c-41d4-bf4c-8f72270fa5fd-kube-api-access-rrpvr\") pod \"nmstate-operator-5d6f6cfd66-s6ttt\" (UID: \"c7616dcc-135c-41d4-bf4c-8f72270fa5fd\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-s6ttt" Sep 29 10:56:46 crc kubenswrapper[4752]: I0929 10:56:46.675922 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrpvr\" (UniqueName: \"kubernetes.io/projected/c7616dcc-135c-41d4-bf4c-8f72270fa5fd-kube-api-access-rrpvr\") pod \"nmstate-operator-5d6f6cfd66-s6ttt\" (UID: 
\"c7616dcc-135c-41d4-bf4c-8f72270fa5fd\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-s6ttt" Sep 29 10:56:46 crc kubenswrapper[4752]: I0929 10:56:46.825658 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-s6ttt" Sep 29 10:56:47 crc kubenswrapper[4752]: I0929 10:56:47.114987 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-s6ttt"] Sep 29 10:56:47 crc kubenswrapper[4752]: I0929 10:56:47.979507 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-s6ttt" event={"ID":"c7616dcc-135c-41d4-bf4c-8f72270fa5fd","Type":"ContainerStarted","Data":"92a4c80079b385369b45f50ecdb033bac95e26e43ff94d25e0fb31be348364fa"} Sep 29 10:56:48 crc kubenswrapper[4752]: I0929 10:56:48.805481 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rvflz"] Sep 29 10:56:48 crc kubenswrapper[4752]: I0929 10:56:48.807563 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" podUID="f3f0dcdd-283d-4ed5-889a-da260dcf13b0" containerName="controller-manager" containerID="cri-o://f4db0405385faa272d014a5e7ff66239694e2731c81f5d9603858b5015cb9e7d" gracePeriod=30 Sep 29 10:56:48 crc kubenswrapper[4752]: I0929 10:56:48.902698 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr"] Sep 29 10:56:48 crc kubenswrapper[4752]: I0929 10:56:48.902985 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" podUID="78a002dc-c902-472c-b269-9ec7c99ab835" containerName="route-controller-manager" containerID="cri-o://794b031167ccc021f2a398b84f937712a08a39f7251e530e88554fbdbd37ebb3" gracePeriod=30 Sep 
29 10:56:48 crc kubenswrapper[4752]: I0929 10:56:48.996100 4752 generic.go:334] "Generic (PLEG): container finished" podID="f3f0dcdd-283d-4ed5-889a-da260dcf13b0" containerID="f4db0405385faa272d014a5e7ff66239694e2731c81f5d9603858b5015cb9e7d" exitCode=0 Sep 29 10:56:48 crc kubenswrapper[4752]: I0929 10:56:48.996161 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" event={"ID":"f3f0dcdd-283d-4ed5-889a-da260dcf13b0","Type":"ContainerDied","Data":"f4db0405385faa272d014a5e7ff66239694e2731c81f5d9603858b5015cb9e7d"} Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.298083 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.348394 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.396854 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-client-ca\") pod \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\" (UID: \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\") " Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.396920 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-proxy-ca-bundles\") pod \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\" (UID: \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\") " Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.396954 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tktcs\" (UniqueName: 
\"kubernetes.io/projected/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-kube-api-access-tktcs\") pod \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\" (UID: \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\") " Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.396986 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-serving-cert\") pod \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\" (UID: \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\") " Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.397019 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a002dc-c902-472c-b269-9ec7c99ab835-serving-cert\") pod \"78a002dc-c902-472c-b269-9ec7c99ab835\" (UID: \"78a002dc-c902-472c-b269-9ec7c99ab835\") " Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.397058 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwmb6\" (UniqueName: \"kubernetes.io/projected/78a002dc-c902-472c-b269-9ec7c99ab835-kube-api-access-wwmb6\") pod \"78a002dc-c902-472c-b269-9ec7c99ab835\" (UID: \"78a002dc-c902-472c-b269-9ec7c99ab835\") " Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.397098 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-config\") pod \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\" (UID: \"f3f0dcdd-283d-4ed5-889a-da260dcf13b0\") " Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.397119 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78a002dc-c902-472c-b269-9ec7c99ab835-client-ca\") pod \"78a002dc-c902-472c-b269-9ec7c99ab835\" (UID: \"78a002dc-c902-472c-b269-9ec7c99ab835\") " Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.397164 4752 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a002dc-c902-472c-b269-9ec7c99ab835-config\") pod \"78a002dc-c902-472c-b269-9ec7c99ab835\" (UID: \"78a002dc-c902-472c-b269-9ec7c99ab835\") " Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.398543 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78a002dc-c902-472c-b269-9ec7c99ab835-config" (OuterVolumeSpecName: "config") pod "78a002dc-c902-472c-b269-9ec7c99ab835" (UID: "78a002dc-c902-472c-b269-9ec7c99ab835"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.399852 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78a002dc-c902-472c-b269-9ec7c99ab835-client-ca" (OuterVolumeSpecName: "client-ca") pod "78a002dc-c902-472c-b269-9ec7c99ab835" (UID: "78a002dc-c902-472c-b269-9ec7c99ab835"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.399878 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-client-ca" (OuterVolumeSpecName: "client-ca") pod "f3f0dcdd-283d-4ed5-889a-da260dcf13b0" (UID: "f3f0dcdd-283d-4ed5-889a-da260dcf13b0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.400094 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-config" (OuterVolumeSpecName: "config") pod "f3f0dcdd-283d-4ed5-889a-da260dcf13b0" (UID: "f3f0dcdd-283d-4ed5-889a-da260dcf13b0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.400114 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f3f0dcdd-283d-4ed5-889a-da260dcf13b0" (UID: "f3f0dcdd-283d-4ed5-889a-da260dcf13b0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.407977 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78a002dc-c902-472c-b269-9ec7c99ab835-kube-api-access-wwmb6" (OuterVolumeSpecName: "kube-api-access-wwmb6") pod "78a002dc-c902-472c-b269-9ec7c99ab835" (UID: "78a002dc-c902-472c-b269-9ec7c99ab835"). InnerVolumeSpecName "kube-api-access-wwmb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.408222 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f3f0dcdd-283d-4ed5-889a-da260dcf13b0" (UID: "f3f0dcdd-283d-4ed5-889a-da260dcf13b0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.408360 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-kube-api-access-tktcs" (OuterVolumeSpecName: "kube-api-access-tktcs") pod "f3f0dcdd-283d-4ed5-889a-da260dcf13b0" (UID: "f3f0dcdd-283d-4ed5-889a-da260dcf13b0"). InnerVolumeSpecName "kube-api-access-tktcs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.409322 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78a002dc-c902-472c-b269-9ec7c99ab835-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "78a002dc-c902-472c-b269-9ec7c99ab835" (UID: "78a002dc-c902-472c-b269-9ec7c99ab835"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.498364 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a002dc-c902-472c-b269-9ec7c99ab835-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.498406 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-client-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.498419 4752 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.498431 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tktcs\" (UniqueName: \"kubernetes.io/projected/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-kube-api-access-tktcs\") on node \"crc\" DevicePath \"\"" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.498441 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.498452 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/78a002dc-c902-472c-b269-9ec7c99ab835-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.498461 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwmb6\" (UniqueName: \"kubernetes.io/projected/78a002dc-c902-472c-b269-9ec7c99ab835-kube-api-access-wwmb6\") on node \"crc\" DevicePath \"\"" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.498469 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3f0dcdd-283d-4ed5-889a-da260dcf13b0-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.498477 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78a002dc-c902-472c-b269-9ec7c99ab835-client-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.943051 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-564895b49-gcndg"] Sep 29 10:56:49 crc kubenswrapper[4752]: E0929 10:56:49.943315 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3f0dcdd-283d-4ed5-889a-da260dcf13b0" containerName="controller-manager" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.943327 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3f0dcdd-283d-4ed5-889a-da260dcf13b0" containerName="controller-manager" Sep 29 10:56:49 crc kubenswrapper[4752]: E0929 10:56:49.943341 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78a002dc-c902-472c-b269-9ec7c99ab835" containerName="route-controller-manager" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.943347 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="78a002dc-c902-472c-b269-9ec7c99ab835" containerName="route-controller-manager" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.943442 4752 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f3f0dcdd-283d-4ed5-889a-da260dcf13b0" containerName="controller-manager" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.943460 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="78a002dc-c902-472c-b269-9ec7c99ab835" containerName="route-controller-manager" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.943950 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:49 crc kubenswrapper[4752]: I0929 10:56:49.959913 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-564895b49-gcndg"] Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.025368 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" event={"ID":"f3f0dcdd-283d-4ed5-889a-da260dcf13b0","Type":"ContainerDied","Data":"41a6dd3179fbf9f69b95fcd8c10f11a3a08fec1e76ddb4a9ea951c5e06ba3c8d"} Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.025453 4752 scope.go:117] "RemoveContainer" containerID="f4db0405385faa272d014a5e7ff66239694e2731c81f5d9603858b5015cb9e7d" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.025607 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-rvflz" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.047201 4752 generic.go:334] "Generic (PLEG): container finished" podID="78a002dc-c902-472c-b269-9ec7c99ab835" containerID="794b031167ccc021f2a398b84f937712a08a39f7251e530e88554fbdbd37ebb3" exitCode=0 Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.061210 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" event={"ID":"78a002dc-c902-472c-b269-9ec7c99ab835","Type":"ContainerDied","Data":"794b031167ccc021f2a398b84f937712a08a39f7251e530e88554fbdbd37ebb3"} Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.061275 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" event={"ID":"78a002dc-c902-472c-b269-9ec7c99ab835","Type":"ContainerDied","Data":"561cdc5d6a664e370b09607e23e77485312a91a9525f6db77e36bd3a2c612949"} Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.064927 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.110313 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2718e4d-e37e-4110-9254-bf94c70f2fff-client-ca\") pod \"controller-manager-564895b49-gcndg\" (UID: \"d2718e4d-e37e-4110-9254-bf94c70f2fff\") " pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.110383 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fftb4\" (UniqueName: \"kubernetes.io/projected/d2718e4d-e37e-4110-9254-bf94c70f2fff-kube-api-access-fftb4\") pod \"controller-manager-564895b49-gcndg\" (UID: \"d2718e4d-e37e-4110-9254-bf94c70f2fff\") " pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.110419 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2718e4d-e37e-4110-9254-bf94c70f2fff-proxy-ca-bundles\") pod \"controller-manager-564895b49-gcndg\" (UID: \"d2718e4d-e37e-4110-9254-bf94c70f2fff\") " pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.110442 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2718e4d-e37e-4110-9254-bf94c70f2fff-config\") pod \"controller-manager-564895b49-gcndg\" (UID: \"d2718e4d-e37e-4110-9254-bf94c70f2fff\") " pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.110492 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2718e4d-e37e-4110-9254-bf94c70f2fff-serving-cert\") pod \"controller-manager-564895b49-gcndg\" (UID: \"d2718e4d-e37e-4110-9254-bf94c70f2fff\") " pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.146894 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rvflz"] Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.148441 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rvflz"] Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.217631 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2718e4d-e37e-4110-9254-bf94c70f2fff-proxy-ca-bundles\") pod \"controller-manager-564895b49-gcndg\" (UID: \"d2718e4d-e37e-4110-9254-bf94c70f2fff\") " pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.217695 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2718e4d-e37e-4110-9254-bf94c70f2fff-config\") pod \"controller-manager-564895b49-gcndg\" (UID: \"d2718e4d-e37e-4110-9254-bf94c70f2fff\") " pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.217787 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2718e4d-e37e-4110-9254-bf94c70f2fff-serving-cert\") pod \"controller-manager-564895b49-gcndg\" (UID: \"d2718e4d-e37e-4110-9254-bf94c70f2fff\") " pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.217836 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2718e4d-e37e-4110-9254-bf94c70f2fff-client-ca\") pod \"controller-manager-564895b49-gcndg\" (UID: \"d2718e4d-e37e-4110-9254-bf94c70f2fff\") " pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.217866 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fftb4\" (UniqueName: \"kubernetes.io/projected/d2718e4d-e37e-4110-9254-bf94c70f2fff-kube-api-access-fftb4\") pod \"controller-manager-564895b49-gcndg\" (UID: \"d2718e4d-e37e-4110-9254-bf94c70f2fff\") " pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.220592 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2718e4d-e37e-4110-9254-bf94c70f2fff-client-ca\") pod \"controller-manager-564895b49-gcndg\" (UID: \"d2718e4d-e37e-4110-9254-bf94c70f2fff\") " pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.220995 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr"] Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.221249 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2718e4d-e37e-4110-9254-bf94c70f2fff-proxy-ca-bundles\") pod \"controller-manager-564895b49-gcndg\" (UID: \"d2718e4d-e37e-4110-9254-bf94c70f2fff\") " pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.221747 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d2718e4d-e37e-4110-9254-bf94c70f2fff-config\") pod \"controller-manager-564895b49-gcndg\" (UID: \"d2718e4d-e37e-4110-9254-bf94c70f2fff\") " pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.224770 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q8xbr"] Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.226869 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2718e4d-e37e-4110-9254-bf94c70f2fff-serving-cert\") pod \"controller-manager-564895b49-gcndg\" (UID: \"d2718e4d-e37e-4110-9254-bf94c70f2fff\") " pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.245767 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fftb4\" (UniqueName: \"kubernetes.io/projected/d2718e4d-e37e-4110-9254-bf94c70f2fff-kube-api-access-fftb4\") pod \"controller-manager-564895b49-gcndg\" (UID: \"d2718e4d-e37e-4110-9254-bf94c70f2fff\") " pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.264558 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.444109 4752 scope.go:117] "RemoveContainer" containerID="794b031167ccc021f2a398b84f937712a08a39f7251e530e88554fbdbd37ebb3" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.465230 4752 scope.go:117] "RemoveContainer" containerID="794b031167ccc021f2a398b84f937712a08a39f7251e530e88554fbdbd37ebb3" Sep 29 10:56:50 crc kubenswrapper[4752]: E0929 10:56:50.468321 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"794b031167ccc021f2a398b84f937712a08a39f7251e530e88554fbdbd37ebb3\": container with ID starting with 794b031167ccc021f2a398b84f937712a08a39f7251e530e88554fbdbd37ebb3 not found: ID does not exist" containerID="794b031167ccc021f2a398b84f937712a08a39f7251e530e88554fbdbd37ebb3" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.468389 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"794b031167ccc021f2a398b84f937712a08a39f7251e530e88554fbdbd37ebb3"} err="failed to get container status \"794b031167ccc021f2a398b84f937712a08a39f7251e530e88554fbdbd37ebb3\": rpc error: code = NotFound desc = could not find container \"794b031167ccc021f2a398b84f937712a08a39f7251e530e88554fbdbd37ebb3\": container with ID starting with 794b031167ccc021f2a398b84f937712a08a39f7251e530e88554fbdbd37ebb3 not found: ID does not exist" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.742843 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-564895b49-gcndg"] Sep 29 10:56:50 crc kubenswrapper[4752]: W0929 10:56:50.753979 4752 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2718e4d_e37e_4110_9254_bf94c70f2fff.slice/crio-abc2b966e767fab66c9738db60626bd3d3a1e649bc766f104b2b388e2debadcd WatchSource:0}: Error finding container abc2b966e767fab66c9738db60626bd3d3a1e649bc766f104b2b388e2debadcd: Status 404 returned error can't find the container with id abc2b966e767fab66c9738db60626bd3d3a1e649bc766f104b2b388e2debadcd Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.792997 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-564895b49-gcndg"] Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.860831 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p"] Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.861767 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.866462 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.867087 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.867086 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.867321 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.867495 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 
10:56:50.867772 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 29 10:56:50 crc kubenswrapper[4752]: I0929 10:56:50.911790 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p"] Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.032375 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e79050-a589-42d9-8d71-1faaf657b8e5-config\") pod \"route-controller-manager-5cb4f7b9df-kxw6p\" (UID: \"25e79050-a589-42d9-8d71-1faaf657b8e5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.033669 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px6f4\" (UniqueName: \"kubernetes.io/projected/25e79050-a589-42d9-8d71-1faaf657b8e5-kube-api-access-px6f4\") pod \"route-controller-manager-5cb4f7b9df-kxw6p\" (UID: \"25e79050-a589-42d9-8d71-1faaf657b8e5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.033824 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25e79050-a589-42d9-8d71-1faaf657b8e5-serving-cert\") pod \"route-controller-manager-5cb4f7b9df-kxw6p\" (UID: \"25e79050-a589-42d9-8d71-1faaf657b8e5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.034011 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/25e79050-a589-42d9-8d71-1faaf657b8e5-client-ca\") pod \"route-controller-manager-5cb4f7b9df-kxw6p\" (UID: \"25e79050-a589-42d9-8d71-1faaf657b8e5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.054465 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-s6ttt" event={"ID":"c7616dcc-135c-41d4-bf4c-8f72270fa5fd","Type":"ContainerStarted","Data":"c0ffae568448f5005b7b0506f8a13a05310f617cc0c56a5150723f325c473436"} Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.056185 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-564895b49-gcndg" podUID="d2718e4d-e37e-4110-9254-bf94c70f2fff" containerName="controller-manager" containerID="cri-o://8c522ae7806f51244a14cc6ec16f5360e2370a3ee74f79999627be9483e2e9be" gracePeriod=30 Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.056500 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-564895b49-gcndg" event={"ID":"d2718e4d-e37e-4110-9254-bf94c70f2fff","Type":"ContainerStarted","Data":"8c522ae7806f51244a14cc6ec16f5360e2370a3ee74f79999627be9483e2e9be"} Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.056541 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-564895b49-gcndg" event={"ID":"d2718e4d-e37e-4110-9254-bf94c70f2fff","Type":"ContainerStarted","Data":"abc2b966e767fab66c9738db60626bd3d3a1e649bc766f104b2b388e2debadcd"} Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.057184 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.063481 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.077982 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-s6ttt" podStartSLOduration=1.678063557 podStartE2EDuration="5.0779641s" podCreationTimestamp="2025-09-29 10:56:46 +0000 UTC" firstStartedPulling="2025-09-29 10:56:47.140496154 +0000 UTC m=+747.929637821" lastFinishedPulling="2025-09-29 10:56:50.540396687 +0000 UTC m=+751.329538364" observedRunningTime="2025-09-29 10:56:51.07568052 +0000 UTC m=+751.864822187" watchObservedRunningTime="2025-09-29 10:56:51.0779641 +0000 UTC m=+751.867105767" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.099046 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-564895b49-gcndg" podStartSLOduration=3.099025319 podStartE2EDuration="3.099025319s" podCreationTimestamp="2025-09-29 10:56:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:56:51.098600968 +0000 UTC m=+751.887742635" watchObservedRunningTime="2025-09-29 10:56:51.099025319 +0000 UTC m=+751.888166986" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.135740 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25e79050-a589-42d9-8d71-1faaf657b8e5-client-ca\") pod \"route-controller-manager-5cb4f7b9df-kxw6p\" (UID: \"25e79050-a589-42d9-8d71-1faaf657b8e5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.135867 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e79050-a589-42d9-8d71-1faaf657b8e5-config\") pod 
\"route-controller-manager-5cb4f7b9df-kxw6p\" (UID: \"25e79050-a589-42d9-8d71-1faaf657b8e5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.135928 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px6f4\" (UniqueName: \"kubernetes.io/projected/25e79050-a589-42d9-8d71-1faaf657b8e5-kube-api-access-px6f4\") pod \"route-controller-manager-5cb4f7b9df-kxw6p\" (UID: \"25e79050-a589-42d9-8d71-1faaf657b8e5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.135959 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25e79050-a589-42d9-8d71-1faaf657b8e5-serving-cert\") pod \"route-controller-manager-5cb4f7b9df-kxw6p\" (UID: \"25e79050-a589-42d9-8d71-1faaf657b8e5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.137076 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25e79050-a589-42d9-8d71-1faaf657b8e5-client-ca\") pod \"route-controller-manager-5cb4f7b9df-kxw6p\" (UID: \"25e79050-a589-42d9-8d71-1faaf657b8e5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.137135 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e79050-a589-42d9-8d71-1faaf657b8e5-config\") pod \"route-controller-manager-5cb4f7b9df-kxw6p\" (UID: \"25e79050-a589-42d9-8d71-1faaf657b8e5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.142866 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25e79050-a589-42d9-8d71-1faaf657b8e5-serving-cert\") pod \"route-controller-manager-5cb4f7b9df-kxw6p\" (UID: \"25e79050-a589-42d9-8d71-1faaf657b8e5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.155330 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px6f4\" (UniqueName: \"kubernetes.io/projected/25e79050-a589-42d9-8d71-1faaf657b8e5-kube-api-access-px6f4\") pod \"route-controller-manager-5cb4f7b9df-kxw6p\" (UID: \"25e79050-a589-42d9-8d71-1faaf657b8e5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.177904 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.423337 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.439613 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fftb4\" (UniqueName: \"kubernetes.io/projected/d2718e4d-e37e-4110-9254-bf94c70f2fff-kube-api-access-fftb4\") pod \"d2718e4d-e37e-4110-9254-bf94c70f2fff\" (UID: \"d2718e4d-e37e-4110-9254-bf94c70f2fff\") " Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.439668 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2718e4d-e37e-4110-9254-bf94c70f2fff-proxy-ca-bundles\") pod \"d2718e4d-e37e-4110-9254-bf94c70f2fff\" (UID: \"d2718e4d-e37e-4110-9254-bf94c70f2fff\") " Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.439738 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2718e4d-e37e-4110-9254-bf94c70f2fff-serving-cert\") pod \"d2718e4d-e37e-4110-9254-bf94c70f2fff\" (UID: \"d2718e4d-e37e-4110-9254-bf94c70f2fff\") " Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.439764 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2718e4d-e37e-4110-9254-bf94c70f2fff-config\") pod \"d2718e4d-e37e-4110-9254-bf94c70f2fff\" (UID: \"d2718e4d-e37e-4110-9254-bf94c70f2fff\") " Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.439794 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2718e4d-e37e-4110-9254-bf94c70f2fff-client-ca\") pod \"d2718e4d-e37e-4110-9254-bf94c70f2fff\" (UID: \"d2718e4d-e37e-4110-9254-bf94c70f2fff\") " Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.444915 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d2718e4d-e37e-4110-9254-bf94c70f2fff-kube-api-access-fftb4" (OuterVolumeSpecName: "kube-api-access-fftb4") pod "d2718e4d-e37e-4110-9254-bf94c70f2fff" (UID: "d2718e4d-e37e-4110-9254-bf94c70f2fff"). InnerVolumeSpecName "kube-api-access-fftb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.448852 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p"] Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.449867 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2718e4d-e37e-4110-9254-bf94c70f2fff-config" (OuterVolumeSpecName: "config") pod "d2718e4d-e37e-4110-9254-bf94c70f2fff" (UID: "d2718e4d-e37e-4110-9254-bf94c70f2fff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.450162 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2718e4d-e37e-4110-9254-bf94c70f2fff-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d2718e4d-e37e-4110-9254-bf94c70f2fff" (UID: "d2718e4d-e37e-4110-9254-bf94c70f2fff"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.451397 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2718e4d-e37e-4110-9254-bf94c70f2fff-client-ca" (OuterVolumeSpecName: "client-ca") pod "d2718e4d-e37e-4110-9254-bf94c70f2fff" (UID: "d2718e4d-e37e-4110-9254-bf94c70f2fff"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.451428 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2718e4d-e37e-4110-9254-bf94c70f2fff-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d2718e4d-e37e-4110-9254-bf94c70f2fff" (UID: "d2718e4d-e37e-4110-9254-bf94c70f2fff"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.541017 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2718e4d-e37e-4110-9254-bf94c70f2fff-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.541062 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2718e4d-e37e-4110-9254-bf94c70f2fff-client-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.541080 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fftb4\" (UniqueName: \"kubernetes.io/projected/d2718e4d-e37e-4110-9254-bf94c70f2fff-kube-api-access-fftb4\") on node \"crc\" DevicePath \"\"" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.541093 4752 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2718e4d-e37e-4110-9254-bf94c70f2fff-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.541104 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2718e4d-e37e-4110-9254-bf94c70f2fff-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.944311 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t"] 
Sep 29 10:56:51 crc kubenswrapper[4752]: E0929 10:56:51.944609 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2718e4d-e37e-4110-9254-bf94c70f2fff" containerName="controller-manager" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.944632 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2718e4d-e37e-4110-9254-bf94c70f2fff" containerName="controller-manager" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.944767 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2718e4d-e37e-4110-9254-bf94c70f2fff" containerName="controller-manager" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.945323 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" Sep 29 10:56:51 crc kubenswrapper[4752]: I0929 10:56:51.961662 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t"] Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.037722 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78a002dc-c902-472c-b269-9ec7c99ab835" path="/var/lib/kubelet/pods/78a002dc-c902-472c-b269-9ec7c99ab835/volumes" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.038561 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3f0dcdd-283d-4ed5-889a-da260dcf13b0" path="/var/lib/kubelet/pods/f3f0dcdd-283d-4ed5-889a-da260dcf13b0/volumes" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.046685 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8687ff2-2c77-4b6e-b82b-16a9fe430013-client-ca\") pod \"controller-manager-777f5d9cd6-wlv7t\" (UID: \"f8687ff2-2c77-4b6e-b82b-16a9fe430013\") " pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 
10:56:52.046775 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8687ff2-2c77-4b6e-b82b-16a9fe430013-serving-cert\") pod \"controller-manager-777f5d9cd6-wlv7t\" (UID: \"f8687ff2-2c77-4b6e-b82b-16a9fe430013\") " pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.046898 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8687ff2-2c77-4b6e-b82b-16a9fe430013-config\") pod \"controller-manager-777f5d9cd6-wlv7t\" (UID: \"f8687ff2-2c77-4b6e-b82b-16a9fe430013\") " pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.046945 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxlc9\" (UniqueName: \"kubernetes.io/projected/f8687ff2-2c77-4b6e-b82b-16a9fe430013-kube-api-access-fxlc9\") pod \"controller-manager-777f5d9cd6-wlv7t\" (UID: \"f8687ff2-2c77-4b6e-b82b-16a9fe430013\") " pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.046991 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8687ff2-2c77-4b6e-b82b-16a9fe430013-proxy-ca-bundles\") pod \"controller-manager-777f5d9cd6-wlv7t\" (UID: \"f8687ff2-2c77-4b6e-b82b-16a9fe430013\") " pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.066907 4752 generic.go:334] "Generic (PLEG): container finished" podID="d2718e4d-e37e-4110-9254-bf94c70f2fff" containerID="8c522ae7806f51244a14cc6ec16f5360e2370a3ee74f79999627be9483e2e9be" exitCode=0 Sep 29 10:56:52 crc 
kubenswrapper[4752]: I0929 10:56:52.066983 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-564895b49-gcndg" event={"ID":"d2718e4d-e37e-4110-9254-bf94c70f2fff","Type":"ContainerDied","Data":"8c522ae7806f51244a14cc6ec16f5360e2370a3ee74f79999627be9483e2e9be"} Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.067178 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-564895b49-gcndg" event={"ID":"d2718e4d-e37e-4110-9254-bf94c70f2fff","Type":"ContainerDied","Data":"abc2b966e767fab66c9738db60626bd3d3a1e649bc766f104b2b388e2debadcd"} Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.067201 4752 scope.go:117] "RemoveContainer" containerID="8c522ae7806f51244a14cc6ec16f5360e2370a3ee74f79999627be9483e2e9be" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.067359 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-564895b49-gcndg" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.069457 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p" event={"ID":"25e79050-a589-42d9-8d71-1faaf657b8e5","Type":"ContainerStarted","Data":"c01d41b6a3ef2b46182c1b8aab2de519fd8aab636580360cda040010ec25fc5e"} Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.069491 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p" event={"ID":"25e79050-a589-42d9-8d71-1faaf657b8e5","Type":"ContainerStarted","Data":"0ffd88c350849bca0a3e3367f8b579f21f63b5b02dd594cb6233ebcc3d184430"} Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.069510 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p" Sep 29 10:56:52 crc 
kubenswrapper[4752]: I0929 10:56:52.088361 4752 scope.go:117] "RemoveContainer" containerID="8c522ae7806f51244a14cc6ec16f5360e2370a3ee74f79999627be9483e2e9be" Sep 29 10:56:52 crc kubenswrapper[4752]: E0929 10:56:52.089285 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c522ae7806f51244a14cc6ec16f5360e2370a3ee74f79999627be9483e2e9be\": container with ID starting with 8c522ae7806f51244a14cc6ec16f5360e2370a3ee74f79999627be9483e2e9be not found: ID does not exist" containerID="8c522ae7806f51244a14cc6ec16f5360e2370a3ee74f79999627be9483e2e9be" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.089317 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c522ae7806f51244a14cc6ec16f5360e2370a3ee74f79999627be9483e2e9be"} err="failed to get container status \"8c522ae7806f51244a14cc6ec16f5360e2370a3ee74f79999627be9483e2e9be\": rpc error: code = NotFound desc = could not find container \"8c522ae7806f51244a14cc6ec16f5360e2370a3ee74f79999627be9483e2e9be\": container with ID starting with 8c522ae7806f51244a14cc6ec16f5360e2370a3ee74f79999627be9483e2e9be not found: ID does not exist" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.097328 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.103708 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-564895b49-gcndg"] Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.110383 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-564895b49-gcndg"] Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.124559 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-5cb4f7b9df-kxw6p" podStartSLOduration=2.124538289 podStartE2EDuration="2.124538289s" podCreationTimestamp="2025-09-29 10:56:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:56:52.120580126 +0000 UTC m=+752.909721793" watchObservedRunningTime="2025-09-29 10:56:52.124538289 +0000 UTC m=+752.913679946" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.147562 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxlc9\" (UniqueName: \"kubernetes.io/projected/f8687ff2-2c77-4b6e-b82b-16a9fe430013-kube-api-access-fxlc9\") pod \"controller-manager-777f5d9cd6-wlv7t\" (UID: \"f8687ff2-2c77-4b6e-b82b-16a9fe430013\") " pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.147643 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8687ff2-2c77-4b6e-b82b-16a9fe430013-proxy-ca-bundles\") pod \"controller-manager-777f5d9cd6-wlv7t\" (UID: \"f8687ff2-2c77-4b6e-b82b-16a9fe430013\") " pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.147685 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8687ff2-2c77-4b6e-b82b-16a9fe430013-client-ca\") pod \"controller-manager-777f5d9cd6-wlv7t\" (UID: \"f8687ff2-2c77-4b6e-b82b-16a9fe430013\") " pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.148825 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8687ff2-2c77-4b6e-b82b-16a9fe430013-proxy-ca-bundles\") 
pod \"controller-manager-777f5d9cd6-wlv7t\" (UID: \"f8687ff2-2c77-4b6e-b82b-16a9fe430013\") " pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.148867 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8687ff2-2c77-4b6e-b82b-16a9fe430013-client-ca\") pod \"controller-manager-777f5d9cd6-wlv7t\" (UID: \"f8687ff2-2c77-4b6e-b82b-16a9fe430013\") " pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.148882 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8687ff2-2c77-4b6e-b82b-16a9fe430013-serving-cert\") pod \"controller-manager-777f5d9cd6-wlv7t\" (UID: \"f8687ff2-2c77-4b6e-b82b-16a9fe430013\") " pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.148988 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8687ff2-2c77-4b6e-b82b-16a9fe430013-config\") pod \"controller-manager-777f5d9cd6-wlv7t\" (UID: \"f8687ff2-2c77-4b6e-b82b-16a9fe430013\") " pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.150168 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8687ff2-2c77-4b6e-b82b-16a9fe430013-config\") pod \"controller-manager-777f5d9cd6-wlv7t\" (UID: \"f8687ff2-2c77-4b6e-b82b-16a9fe430013\") " pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.154135 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f8687ff2-2c77-4b6e-b82b-16a9fe430013-serving-cert\") pod \"controller-manager-777f5d9cd6-wlv7t\" (UID: \"f8687ff2-2c77-4b6e-b82b-16a9fe430013\") " pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.169668 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxlc9\" (UniqueName: \"kubernetes.io/projected/f8687ff2-2c77-4b6e-b82b-16a9fe430013-kube-api-access-fxlc9\") pod \"controller-manager-777f5d9cd6-wlv7t\" (UID: \"f8687ff2-2c77-4b6e-b82b-16a9fe430013\") " pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.259623 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" Sep 29 10:56:52 crc kubenswrapper[4752]: I0929 10:56:52.525550 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t"] Sep 29 10:56:52 crc kubenswrapper[4752]: W0929 10:56:52.532690 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8687ff2_2c77_4b6e_b82b_16a9fe430013.slice/crio-10f525791fdf36dca08ff439077990a3c009e85792bbdeec43c83af3a069caa4 WatchSource:0}: Error finding container 10f525791fdf36dca08ff439077990a3c009e85792bbdeec43c83af3a069caa4: Status 404 returned error can't find the container with id 10f525791fdf36dca08ff439077990a3c009e85792bbdeec43c83af3a069caa4 Sep 29 10:56:53 crc kubenswrapper[4752]: I0929 10:56:53.076504 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" event={"ID":"f8687ff2-2c77-4b6e-b82b-16a9fe430013","Type":"ContainerStarted","Data":"ec66cb7b02206f1245703e95f6efc200eb311ea249a51b96dffa8f71368aa9e9"} Sep 29 10:56:53 crc kubenswrapper[4752]: I0929 
10:56:53.076573 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" event={"ID":"f8687ff2-2c77-4b6e-b82b-16a9fe430013","Type":"ContainerStarted","Data":"10f525791fdf36dca08ff439077990a3c009e85792bbdeec43c83af3a069caa4"} Sep 29 10:56:53 crc kubenswrapper[4752]: I0929 10:56:53.076734 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" Sep 29 10:56:53 crc kubenswrapper[4752]: I0929 10:56:53.088172 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" Sep 29 10:56:53 crc kubenswrapper[4752]: I0929 10:56:53.099699 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-777f5d9cd6-wlv7t" podStartSLOduration=3.099670705 podStartE2EDuration="3.099670705s" podCreationTimestamp="2025-09-29 10:56:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:56:53.095453515 +0000 UTC m=+753.884595202" watchObservedRunningTime="2025-09-29 10:56:53.099670705 +0000 UTC m=+753.888812412" Sep 29 10:56:54 crc kubenswrapper[4752]: I0929 10:56:54.040892 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2718e4d-e37e-4110-9254-bf94c70f2fff" path="/var/lib/kubelet/pods/d2718e4d-e37e-4110-9254-bf94c70f2fff/volumes" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.570276 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-7pc9n"] Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.571496 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-7pc9n" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.583246 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-7pc9n"] Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.591746 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-75qgt"] Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.592930 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-75qgt" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.595261 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.609821 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgwkr\" (UniqueName: \"kubernetes.io/projected/1031b1fd-032a-4972-835d-8f53171bc7de-kube-api-access-zgwkr\") pod \"nmstate-metrics-58fcddf996-7pc9n\" (UID: \"1031b1fd-032a-4972-835d-8f53171bc7de\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-7pc9n" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.609902 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq7hg\" (UniqueName: \"kubernetes.io/projected/91fc3e7a-9ef1-435e-9fac-a273bdf2de57-kube-api-access-sq7hg\") pod \"nmstate-webhook-6d689559c5-75qgt\" (UID: \"91fc3e7a-9ef1-435e-9fac-a273bdf2de57\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-75qgt" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.610052 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/91fc3e7a-9ef1-435e-9fac-a273bdf2de57-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-75qgt\" (UID: 
\"91fc3e7a-9ef1-435e-9fac-a273bdf2de57\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-75qgt" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.617851 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-nz9pc"] Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.618860 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-nz9pc" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.623299 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-75qgt"] Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.702514 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gwrnx"] Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.703390 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gwrnx" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.705634 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.706703 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-wjjfk" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.706751 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.711556 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gwrnx"] Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.711889 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/91fc3e7a-9ef1-435e-9fac-a273bdf2de57-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-75qgt\" 
(UID: \"91fc3e7a-9ef1-435e-9fac-a273bdf2de57\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-75qgt" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.711993 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgwkr\" (UniqueName: \"kubernetes.io/projected/1031b1fd-032a-4972-835d-8f53171bc7de-kube-api-access-zgwkr\") pod \"nmstate-metrics-58fcddf996-7pc9n\" (UID: \"1031b1fd-032a-4972-835d-8f53171bc7de\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-7pc9n" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.712068 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5bace403-1b12-45b3-a28c-8a7c58393a22-nmstate-lock\") pod \"nmstate-handler-nz9pc\" (UID: \"5bace403-1b12-45b3-a28c-8a7c58393a22\") " pod="openshift-nmstate/nmstate-handler-nz9pc" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.712098 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5bace403-1b12-45b3-a28c-8a7c58393a22-dbus-socket\") pod \"nmstate-handler-nz9pc\" (UID: \"5bace403-1b12-45b3-a28c-8a7c58393a22\") " pod="openshift-nmstate/nmstate-handler-nz9pc" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.712127 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq6gp\" (UniqueName: \"kubernetes.io/projected/5bace403-1b12-45b3-a28c-8a7c58393a22-kube-api-access-vq6gp\") pod \"nmstate-handler-nz9pc\" (UID: \"5bace403-1b12-45b3-a28c-8a7c58393a22\") " pod="openshift-nmstate/nmstate-handler-nz9pc" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.712165 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/5bace403-1b12-45b3-a28c-8a7c58393a22-ovs-socket\") pod \"nmstate-handler-nz9pc\" (UID: \"5bace403-1b12-45b3-a28c-8a7c58393a22\") " pod="openshift-nmstate/nmstate-handler-nz9pc" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.712204 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq7hg\" (UniqueName: \"kubernetes.io/projected/91fc3e7a-9ef1-435e-9fac-a273bdf2de57-kube-api-access-sq7hg\") pod \"nmstate-webhook-6d689559c5-75qgt\" (UID: \"91fc3e7a-9ef1-435e-9fac-a273bdf2de57\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-75qgt" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.730040 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/91fc3e7a-9ef1-435e-9fac-a273bdf2de57-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-75qgt\" (UID: \"91fc3e7a-9ef1-435e-9fac-a273bdf2de57\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-75qgt" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.731507 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq7hg\" (UniqueName: \"kubernetes.io/projected/91fc3e7a-9ef1-435e-9fac-a273bdf2de57-kube-api-access-sq7hg\") pod \"nmstate-webhook-6d689559c5-75qgt\" (UID: \"91fc3e7a-9ef1-435e-9fac-a273bdf2de57\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-75qgt" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.739787 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgwkr\" (UniqueName: \"kubernetes.io/projected/1031b1fd-032a-4972-835d-8f53171bc7de-kube-api-access-zgwkr\") pod \"nmstate-metrics-58fcddf996-7pc9n\" (UID: \"1031b1fd-032a-4972-835d-8f53171bc7de\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-7pc9n" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.814222 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-d79mm\" (UniqueName: \"kubernetes.io/projected/1f74c3bc-756d-48fc-848c-c8e8a045dee4-kube-api-access-d79mm\") pod \"nmstate-console-plugin-864bb6dfb5-gwrnx\" (UID: \"1f74c3bc-756d-48fc-848c-c8e8a045dee4\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gwrnx" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.814314 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5bace403-1b12-45b3-a28c-8a7c58393a22-nmstate-lock\") pod \"nmstate-handler-nz9pc\" (UID: \"5bace403-1b12-45b3-a28c-8a7c58393a22\") " pod="openshift-nmstate/nmstate-handler-nz9pc" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.814369 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5bace403-1b12-45b3-a28c-8a7c58393a22-dbus-socket\") pod \"nmstate-handler-nz9pc\" (UID: \"5bace403-1b12-45b3-a28c-8a7c58393a22\") " pod="openshift-nmstate/nmstate-handler-nz9pc" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.814397 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq6gp\" (UniqueName: \"kubernetes.io/projected/5bace403-1b12-45b3-a28c-8a7c58393a22-kube-api-access-vq6gp\") pod \"nmstate-handler-nz9pc\" (UID: \"5bace403-1b12-45b3-a28c-8a7c58393a22\") " pod="openshift-nmstate/nmstate-handler-nz9pc" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.814436 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5bace403-1b12-45b3-a28c-8a7c58393a22-ovs-socket\") pod \"nmstate-handler-nz9pc\" (UID: \"5bace403-1b12-45b3-a28c-8a7c58393a22\") " pod="openshift-nmstate/nmstate-handler-nz9pc" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.814486 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f74c3bc-756d-48fc-848c-c8e8a045dee4-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-gwrnx\" (UID: \"1f74c3bc-756d-48fc-848c-c8e8a045dee4\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gwrnx" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.814506 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1f74c3bc-756d-48fc-848c-c8e8a045dee4-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-gwrnx\" (UID: \"1f74c3bc-756d-48fc-848c-c8e8a045dee4\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gwrnx" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.814623 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5bace403-1b12-45b3-a28c-8a7c58393a22-nmstate-lock\") pod \"nmstate-handler-nz9pc\" (UID: \"5bace403-1b12-45b3-a28c-8a7c58393a22\") " pod="openshift-nmstate/nmstate-handler-nz9pc" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.814985 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5bace403-1b12-45b3-a28c-8a7c58393a22-dbus-socket\") pod \"nmstate-handler-nz9pc\" (UID: \"5bace403-1b12-45b3-a28c-8a7c58393a22\") " pod="openshift-nmstate/nmstate-handler-nz9pc" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.815025 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5bace403-1b12-45b3-a28c-8a7c58393a22-ovs-socket\") pod \"nmstate-handler-nz9pc\" (UID: \"5bace403-1b12-45b3-a28c-8a7c58393a22\") " pod="openshift-nmstate/nmstate-handler-nz9pc" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.837718 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq6gp\" (UniqueName: 
\"kubernetes.io/projected/5bace403-1b12-45b3-a28c-8a7c58393a22-kube-api-access-vq6gp\") pod \"nmstate-handler-nz9pc\" (UID: \"5bace403-1b12-45b3-a28c-8a7c58393a22\") " pod="openshift-nmstate/nmstate-handler-nz9pc" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.880001 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-85546cf746-dfhr8"] Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.881243 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.893687 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85546cf746-dfhr8"] Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.897775 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-7pc9n" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.912563 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-75qgt" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.916441 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-service-ca\") pod \"console-85546cf746-dfhr8\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.916544 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rjsw\" (UniqueName: \"kubernetes.io/projected/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-kube-api-access-6rjsw\") pod \"console-85546cf746-dfhr8\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.916588 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f74c3bc-756d-48fc-848c-c8e8a045dee4-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-gwrnx\" (UID: \"1f74c3bc-756d-48fc-848c-c8e8a045dee4\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gwrnx" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.916617 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-oauth-serving-cert\") pod \"console-85546cf746-dfhr8\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.916910 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1f74c3bc-756d-48fc-848c-c8e8a045dee4-nginx-conf\") pod 
\"nmstate-console-plugin-864bb6dfb5-gwrnx\" (UID: \"1f74c3bc-756d-48fc-848c-c8e8a045dee4\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gwrnx" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.916963 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-trusted-ca-bundle\") pod \"console-85546cf746-dfhr8\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.917048 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-console-serving-cert\") pod \"console-85546cf746-dfhr8\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.917085 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-console-config\") pod \"console-85546cf746-dfhr8\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.917117 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d79mm\" (UniqueName: \"kubernetes.io/projected/1f74c3bc-756d-48fc-848c-c8e8a045dee4-kube-api-access-d79mm\") pod \"nmstate-console-plugin-864bb6dfb5-gwrnx\" (UID: \"1f74c3bc-756d-48fc-848c-c8e8a045dee4\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gwrnx" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.917165 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-console-oauth-config\") pod \"console-85546cf746-dfhr8\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.918283 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1f74c3bc-756d-48fc-848c-c8e8a045dee4-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-gwrnx\" (UID: \"1f74c3bc-756d-48fc-848c-c8e8a045dee4\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gwrnx" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.921194 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f74c3bc-756d-48fc-848c-c8e8a045dee4-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-gwrnx\" (UID: \"1f74c3bc-756d-48fc-848c-c8e8a045dee4\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gwrnx" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.941968 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-nz9pc" Sep 29 10:56:55 crc kubenswrapper[4752]: I0929 10:56:55.956918 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d79mm\" (UniqueName: \"kubernetes.io/projected/1f74c3bc-756d-48fc-848c-c8e8a045dee4-kube-api-access-d79mm\") pod \"nmstate-console-plugin-864bb6dfb5-gwrnx\" (UID: \"1f74c3bc-756d-48fc-848c-c8e8a045dee4\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gwrnx" Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.021702 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-oauth-serving-cert\") pod \"console-85546cf746-dfhr8\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.021763 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-trusted-ca-bundle\") pod \"console-85546cf746-dfhr8\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.021819 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-console-serving-cert\") pod \"console-85546cf746-dfhr8\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.021846 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-console-config\") pod \"console-85546cf746-dfhr8\" (UID: 
\"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.021875 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-console-oauth-config\") pod \"console-85546cf746-dfhr8\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.021894 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-service-ca\") pod \"console-85546cf746-dfhr8\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.021933 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rjsw\" (UniqueName: \"kubernetes.io/projected/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-kube-api-access-6rjsw\") pod \"console-85546cf746-dfhr8\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.026833 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-trusted-ca-bundle\") pod \"console-85546cf746-dfhr8\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.027164 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gwrnx" Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.028401 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-oauth-serving-cert\") pod \"console-85546cf746-dfhr8\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.029175 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-console-config\") pod \"console-85546cf746-dfhr8\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.029376 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-service-ca\") pod \"console-85546cf746-dfhr8\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.033377 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-console-serving-cert\") pod \"console-85546cf746-dfhr8\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.034197 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-console-oauth-config\") pod \"console-85546cf746-dfhr8\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " 
pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.056622 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rjsw\" (UniqueName: \"kubernetes.io/projected/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-kube-api-access-6rjsw\") pod \"console-85546cf746-dfhr8\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.099948 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-nz9pc" event={"ID":"5bace403-1b12-45b3-a28c-8a7c58393a22","Type":"ContainerStarted","Data":"fbfce917a7e8be76e7d272055ce7b4e3f7a0ce33ad2445b6eb129f7e9bc621be"} Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.175630 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.175700 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.175761 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.176644 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c407b08be26fe95221bcb36f9b8690f867d6ce7b5902b3cd248dbfd3fb7865c7"} 
pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.176714 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" containerID="cri-o://c407b08be26fe95221bcb36f9b8690f867d6ce7b5902b3cd248dbfd3fb7865c7" gracePeriod=600 Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.210237 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.261643 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-7pc9n"] Sep 29 10:56:56 crc kubenswrapper[4752]: W0929 10:56:56.278039 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1031b1fd_032a_4972_835d_8f53171bc7de.slice/crio-8f5d5c50c63300e41fa34cec5685c7f6556d489ad1c24cc2d1b2410b24dfc45f WatchSource:0}: Error finding container 8f5d5c50c63300e41fa34cec5685c7f6556d489ad1c24cc2d1b2410b24dfc45f: Status 404 returned error can't find the container with id 8f5d5c50c63300e41fa34cec5685c7f6556d489ad1c24cc2d1b2410b24dfc45f Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.293562 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-75qgt"] Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.355994 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gwrnx"] Sep 29 10:56:56 crc kubenswrapper[4752]: I0929 10:56:56.490780 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85546cf746-dfhr8"] Sep 29 10:56:57 
crc kubenswrapper[4752]: I0929 10:56:57.108972 4752 generic.go:334] "Generic (PLEG): container finished" podID="5863c243-797d-462a-b11f-71aaf005f8d1" containerID="c407b08be26fe95221bcb36f9b8690f867d6ce7b5902b3cd248dbfd3fb7865c7" exitCode=0 Sep 29 10:56:57 crc kubenswrapper[4752]: I0929 10:56:57.109094 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerDied","Data":"c407b08be26fe95221bcb36f9b8690f867d6ce7b5902b3cd248dbfd3fb7865c7"} Sep 29 10:56:57 crc kubenswrapper[4752]: I0929 10:56:57.109817 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerStarted","Data":"dbba8a90f680e465e868c9761ab597851b2db8c336cda0417acd2b4d326ea54a"} Sep 29 10:56:57 crc kubenswrapper[4752]: I0929 10:56:57.109900 4752 scope.go:117] "RemoveContainer" containerID="3f331ecf545ae76e5229e9ac291d3a8fabb44711af838903a027a58783c88d03" Sep 29 10:56:57 crc kubenswrapper[4752]: I0929 10:56:57.112230 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85546cf746-dfhr8" event={"ID":"4f94737b-80ce-4e92-85df-18fa2b1cbe8e","Type":"ContainerStarted","Data":"83b24978d536cb9d3776d3364ef7f94a3f28f84f3fd0ee80edc7ba5d4946f101"} Sep 29 10:56:57 crc kubenswrapper[4752]: I0929 10:56:57.112278 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85546cf746-dfhr8" event={"ID":"4f94737b-80ce-4e92-85df-18fa2b1cbe8e","Type":"ContainerStarted","Data":"c9445b3e2f7a928bf6777b3822af936db527b5afef5bb86e0775789d605f8c6e"} Sep 29 10:56:57 crc kubenswrapper[4752]: I0929 10:56:57.113655 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gwrnx" 
event={"ID":"1f74c3bc-756d-48fc-848c-c8e8a045dee4","Type":"ContainerStarted","Data":"9ef2e38323b7c5bc4afe881a88f38adb0b98082ad9d1b9acffc6490e18c00725"} Sep 29 10:56:57 crc kubenswrapper[4752]: I0929 10:56:57.115006 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-7pc9n" event={"ID":"1031b1fd-032a-4972-835d-8f53171bc7de","Type":"ContainerStarted","Data":"8f5d5c50c63300e41fa34cec5685c7f6556d489ad1c24cc2d1b2410b24dfc45f"} Sep 29 10:56:57 crc kubenswrapper[4752]: I0929 10:56:57.116731 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-75qgt" event={"ID":"91fc3e7a-9ef1-435e-9fac-a273bdf2de57","Type":"ContainerStarted","Data":"ee8f1354a673c85669ace74b948aaa9e93cc92c0ba1d61042f86d9e99265592b"} Sep 29 10:56:58 crc kubenswrapper[4752]: I0929 10:56:58.495931 4752 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 29 10:57:00 crc kubenswrapper[4752]: I0929 10:57:00.061510 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85546cf746-dfhr8" podStartSLOduration=5.061486799 podStartE2EDuration="5.061486799s" podCreationTimestamp="2025-09-29 10:56:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:56:57.150382995 +0000 UTC m=+757.939524672" watchObservedRunningTime="2025-09-29 10:57:00.061486799 +0000 UTC m=+760.850628456" Sep 29 10:57:02 crc kubenswrapper[4752]: I0929 10:57:02.153483 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-75qgt" event={"ID":"91fc3e7a-9ef1-435e-9fac-a273bdf2de57","Type":"ContainerStarted","Data":"454386faf1da33d78f512b062d144a3717c03d33e6327de91b67fe05269be58a"} Sep 29 10:57:02 crc kubenswrapper[4752]: I0929 10:57:02.154617 4752 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-75qgt" Sep 29 10:57:02 crc kubenswrapper[4752]: I0929 10:57:02.156264 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gwrnx" event={"ID":"1f74c3bc-756d-48fc-848c-c8e8a045dee4","Type":"ContainerStarted","Data":"9318344e9708f75def90bcd761333c87e56ec45b6aff63a06767f760f345f9fe"} Sep 29 10:57:02 crc kubenswrapper[4752]: I0929 10:57:02.158182 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-7pc9n" event={"ID":"1031b1fd-032a-4972-835d-8f53171bc7de","Type":"ContainerStarted","Data":"5846ae3c64864d8fc3ee73faabfcaa42d0dc3c31992672927ab93f94a0c6c988"} Sep 29 10:57:02 crc kubenswrapper[4752]: I0929 10:57:02.159911 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-nz9pc" event={"ID":"5bace403-1b12-45b3-a28c-8a7c58393a22","Type":"ContainerStarted","Data":"1ae76b1b6169d6b839c21a008847a90fc133ccc377a5dcd93100724bff7725f5"} Sep 29 10:57:02 crc kubenswrapper[4752]: I0929 10:57:02.160144 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-nz9pc" Sep 29 10:57:02 crc kubenswrapper[4752]: I0929 10:57:02.183421 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-75qgt" podStartSLOduration=3.809563014 podStartE2EDuration="7.183390658s" podCreationTimestamp="2025-09-29 10:56:55 +0000 UTC" firstStartedPulling="2025-09-29 10:56:56.30524849 +0000 UTC m=+757.094390157" lastFinishedPulling="2025-09-29 10:56:59.679076134 +0000 UTC m=+760.468217801" observedRunningTime="2025-09-29 10:57:02.177796672 +0000 UTC m=+762.966938339" watchObservedRunningTime="2025-09-29 10:57:02.183390658 +0000 UTC m=+762.972532325" Sep 29 10:57:02 crc kubenswrapper[4752]: I0929 10:57:02.212979 4752 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-nmstate/nmstate-handler-nz9pc" podStartSLOduration=3.586592068 podStartE2EDuration="7.212956259s" podCreationTimestamp="2025-09-29 10:56:55 +0000 UTC" firstStartedPulling="2025-09-29 10:56:56.082254963 +0000 UTC m=+756.871396630" lastFinishedPulling="2025-09-29 10:56:59.708619154 +0000 UTC m=+760.497760821" observedRunningTime="2025-09-29 10:57:02.209423136 +0000 UTC m=+762.998564823" watchObservedRunningTime="2025-09-29 10:57:02.212956259 +0000 UTC m=+763.002097926" Sep 29 10:57:02 crc kubenswrapper[4752]: I0929 10:57:02.229092 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gwrnx" podStartSLOduration=3.917205541 podStartE2EDuration="7.229067709s" podCreationTimestamp="2025-09-29 10:56:55 +0000 UTC" firstStartedPulling="2025-09-29 10:56:56.365870861 +0000 UTC m=+757.155012528" lastFinishedPulling="2025-09-29 10:56:59.677733029 +0000 UTC m=+760.466874696" observedRunningTime="2025-09-29 10:57:02.226972165 +0000 UTC m=+763.016113832" watchObservedRunningTime="2025-09-29 10:57:02.229067709 +0000 UTC m=+763.018209376" Sep 29 10:57:05 crc kubenswrapper[4752]: I0929 10:57:05.180490 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-7pc9n" event={"ID":"1031b1fd-032a-4972-835d-8f53171bc7de","Type":"ContainerStarted","Data":"752019fbaa77f62894f431e3fc9e6c796acb3b3887273dfb4bbc13e12e11f2cb"} Sep 29 10:57:05 crc kubenswrapper[4752]: I0929 10:57:05.202590 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-7pc9n" podStartSLOduration=2.032094309 podStartE2EDuration="10.202563911s" podCreationTimestamp="2025-09-29 10:56:55 +0000 UTC" firstStartedPulling="2025-09-29 10:56:56.286966023 +0000 UTC m=+757.076107690" lastFinishedPulling="2025-09-29 10:57:04.457435625 +0000 UTC m=+765.246577292" observedRunningTime="2025-09-29 10:57:05.197751376 
+0000 UTC m=+765.986893043" watchObservedRunningTime="2025-09-29 10:57:05.202563911 +0000 UTC m=+765.991705578" Sep 29 10:57:06 crc kubenswrapper[4752]: I0929 10:57:06.211129 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:57:06 crc kubenswrapper[4752]: I0929 10:57:06.211184 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:57:06 crc kubenswrapper[4752]: I0929 10:57:06.215130 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:57:07 crc kubenswrapper[4752]: I0929 10:57:07.193912 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-85546cf746-dfhr8" Sep 29 10:57:07 crc kubenswrapper[4752]: I0929 10:57:07.246628 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-bp6hz"] Sep 29 10:57:10 crc kubenswrapper[4752]: I0929 10:57:10.969656 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-nz9pc" Sep 29 10:57:15 crc kubenswrapper[4752]: I0929 10:57:15.918954 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-75qgt" Sep 29 10:57:27 crc kubenswrapper[4752]: I0929 10:57:27.376300 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dsdrq"] Sep 29 10:57:27 crc kubenswrapper[4752]: I0929 10:57:27.378730 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dsdrq" Sep 29 10:57:27 crc kubenswrapper[4752]: I0929 10:57:27.390830 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dsdrq"] Sep 29 10:57:27 crc kubenswrapper[4752]: I0929 10:57:27.422613 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3114de4f-e714-4102-ae6e-53e08a01a180-catalog-content\") pod \"redhat-marketplace-dsdrq\" (UID: \"3114de4f-e714-4102-ae6e-53e08a01a180\") " pod="openshift-marketplace/redhat-marketplace-dsdrq" Sep 29 10:57:27 crc kubenswrapper[4752]: I0929 10:57:27.422672 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttv4f\" (UniqueName: \"kubernetes.io/projected/3114de4f-e714-4102-ae6e-53e08a01a180-kube-api-access-ttv4f\") pod \"redhat-marketplace-dsdrq\" (UID: \"3114de4f-e714-4102-ae6e-53e08a01a180\") " pod="openshift-marketplace/redhat-marketplace-dsdrq" Sep 29 10:57:27 crc kubenswrapper[4752]: I0929 10:57:27.422705 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3114de4f-e714-4102-ae6e-53e08a01a180-utilities\") pod \"redhat-marketplace-dsdrq\" (UID: \"3114de4f-e714-4102-ae6e-53e08a01a180\") " pod="openshift-marketplace/redhat-marketplace-dsdrq" Sep 29 10:57:27 crc kubenswrapper[4752]: I0929 10:57:27.524851 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3114de4f-e714-4102-ae6e-53e08a01a180-catalog-content\") pod \"redhat-marketplace-dsdrq\" (UID: \"3114de4f-e714-4102-ae6e-53e08a01a180\") " pod="openshift-marketplace/redhat-marketplace-dsdrq" Sep 29 10:57:27 crc kubenswrapper[4752]: I0929 10:57:27.524904 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ttv4f\" (UniqueName: \"kubernetes.io/projected/3114de4f-e714-4102-ae6e-53e08a01a180-kube-api-access-ttv4f\") pod \"redhat-marketplace-dsdrq\" (UID: \"3114de4f-e714-4102-ae6e-53e08a01a180\") " pod="openshift-marketplace/redhat-marketplace-dsdrq" Sep 29 10:57:27 crc kubenswrapper[4752]: I0929 10:57:27.524941 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3114de4f-e714-4102-ae6e-53e08a01a180-utilities\") pod \"redhat-marketplace-dsdrq\" (UID: \"3114de4f-e714-4102-ae6e-53e08a01a180\") " pod="openshift-marketplace/redhat-marketplace-dsdrq" Sep 29 10:57:27 crc kubenswrapper[4752]: I0929 10:57:27.526045 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3114de4f-e714-4102-ae6e-53e08a01a180-utilities\") pod \"redhat-marketplace-dsdrq\" (UID: \"3114de4f-e714-4102-ae6e-53e08a01a180\") " pod="openshift-marketplace/redhat-marketplace-dsdrq" Sep 29 10:57:27 crc kubenswrapper[4752]: I0929 10:57:27.526398 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3114de4f-e714-4102-ae6e-53e08a01a180-catalog-content\") pod \"redhat-marketplace-dsdrq\" (UID: \"3114de4f-e714-4102-ae6e-53e08a01a180\") " pod="openshift-marketplace/redhat-marketplace-dsdrq" Sep 29 10:57:27 crc kubenswrapper[4752]: I0929 10:57:27.550750 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttv4f\" (UniqueName: \"kubernetes.io/projected/3114de4f-e714-4102-ae6e-53e08a01a180-kube-api-access-ttv4f\") pod \"redhat-marketplace-dsdrq\" (UID: \"3114de4f-e714-4102-ae6e-53e08a01a180\") " pod="openshift-marketplace/redhat-marketplace-dsdrq" Sep 29 10:57:27 crc kubenswrapper[4752]: I0929 10:57:27.708453 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dsdrq" Sep 29 10:57:27 crc kubenswrapper[4752]: I0929 10:57:27.947972 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dsdrq"] Sep 29 10:57:28 crc kubenswrapper[4752]: I0929 10:57:28.329619 4752 generic.go:334] "Generic (PLEG): container finished" podID="3114de4f-e714-4102-ae6e-53e08a01a180" containerID="1c99f5b857825fed73823e6d456a3aa5c2acc5e6798aa7c39fe73ca66e7d2734" exitCode=0 Sep 29 10:57:28 crc kubenswrapper[4752]: I0929 10:57:28.329712 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsdrq" event={"ID":"3114de4f-e714-4102-ae6e-53e08a01a180","Type":"ContainerDied","Data":"1c99f5b857825fed73823e6d456a3aa5c2acc5e6798aa7c39fe73ca66e7d2734"} Sep 29 10:57:28 crc kubenswrapper[4752]: I0929 10:57:28.330061 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsdrq" event={"ID":"3114de4f-e714-4102-ae6e-53e08a01a180","Type":"ContainerStarted","Data":"4894eefb8766fdd93f0073ab449bdbd83f0715bb21ef3a3a22c2256a49e85d8e"} Sep 29 10:57:29 crc kubenswrapper[4752]: I0929 10:57:29.338585 4752 generic.go:334] "Generic (PLEG): container finished" podID="3114de4f-e714-4102-ae6e-53e08a01a180" containerID="49693c9523a87b1d9e31c2393e5d28934a3cce37f664b7045ea23f2f07558786" exitCode=0 Sep 29 10:57:29 crc kubenswrapper[4752]: I0929 10:57:29.338884 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsdrq" event={"ID":"3114de4f-e714-4102-ae6e-53e08a01a180","Type":"ContainerDied","Data":"49693c9523a87b1d9e31c2393e5d28934a3cce37f664b7045ea23f2f07558786"} Sep 29 10:57:30 crc kubenswrapper[4752]: I0929 10:57:30.347405 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsdrq" 
event={"ID":"3114de4f-e714-4102-ae6e-53e08a01a180","Type":"ContainerStarted","Data":"ace76e5d37482d9ac42c361b8deef4423420d53e88901f5a668a505b66048619"} Sep 29 10:57:30 crc kubenswrapper[4752]: I0929 10:57:30.367912 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dsdrq" podStartSLOduration=1.85722947 podStartE2EDuration="3.367886795s" podCreationTimestamp="2025-09-29 10:57:27 +0000 UTC" firstStartedPulling="2025-09-29 10:57:28.332611627 +0000 UTC m=+789.121753294" lastFinishedPulling="2025-09-29 10:57:29.843268952 +0000 UTC m=+790.632410619" observedRunningTime="2025-09-29 10:57:30.367372162 +0000 UTC m=+791.156513849" watchObservedRunningTime="2025-09-29 10:57:30.367886795 +0000 UTC m=+791.157028472" Sep 29 10:57:31 crc kubenswrapper[4752]: I0929 10:57:31.804927 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4"] Sep 29 10:57:31 crc kubenswrapper[4752]: I0929 10:57:31.806470 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4" Sep 29 10:57:31 crc kubenswrapper[4752]: I0929 10:57:31.810100 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 29 10:57:31 crc kubenswrapper[4752]: I0929 10:57:31.817502 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4"] Sep 29 10:57:31 crc kubenswrapper[4752]: I0929 10:57:31.892188 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/71f15cd4-4eb9-4fcb-b2e9-789c06fa9670-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4\" (UID: \"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4" Sep 29 10:57:31 crc kubenswrapper[4752]: I0929 10:57:31.892261 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/71f15cd4-4eb9-4fcb-b2e9-789c06fa9670-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4\" (UID: \"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4" Sep 29 10:57:31 crc kubenswrapper[4752]: I0929 10:57:31.892396 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrwkg\" (UniqueName: \"kubernetes.io/projected/71f15cd4-4eb9-4fcb-b2e9-789c06fa9670-kube-api-access-vrwkg\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4\" (UID: \"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4" Sep 29 10:57:31 crc kubenswrapper[4752]: 
I0929 10:57:31.994030 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrwkg\" (UniqueName: \"kubernetes.io/projected/71f15cd4-4eb9-4fcb-b2e9-789c06fa9670-kube-api-access-vrwkg\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4\" (UID: \"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4" Sep 29 10:57:31 crc kubenswrapper[4752]: I0929 10:57:31.994103 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/71f15cd4-4eb9-4fcb-b2e9-789c06fa9670-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4\" (UID: \"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4" Sep 29 10:57:31 crc kubenswrapper[4752]: I0929 10:57:31.994130 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/71f15cd4-4eb9-4fcb-b2e9-789c06fa9670-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4\" (UID: \"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4" Sep 29 10:57:31 crc kubenswrapper[4752]: I0929 10:57:31.994667 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/71f15cd4-4eb9-4fcb-b2e9-789c06fa9670-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4\" (UID: \"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4" Sep 29 10:57:31 crc kubenswrapper[4752]: I0929 10:57:31.994782 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/71f15cd4-4eb9-4fcb-b2e9-789c06fa9670-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4\" (UID: \"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4" Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.015326 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrwkg\" (UniqueName: \"kubernetes.io/projected/71f15cd4-4eb9-4fcb-b2e9-789c06fa9670-kube-api-access-vrwkg\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4\" (UID: \"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4" Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.122854 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4" Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.296734 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-bp6hz" podUID="53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28" containerName="console" containerID="cri-o://69f7996091c1816562d3e389be666e8d0a58e4432aa8c659b999aa8397210cd8" gracePeriod=15 Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.349454 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4"] Sep 29 10:57:32 crc kubenswrapper[4752]: W0929 10:57:32.388648 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71f15cd4_4eb9_4fcb_b2e9_789c06fa9670.slice/crio-dc4a50c762b6afa8e1de9043d3c7b21738767f41fa4ae1271387abcd94b298ec WatchSource:0}: Error finding container dc4a50c762b6afa8e1de9043d3c7b21738767f41fa4ae1271387abcd94b298ec: Status 404 returned 
error can't find the container with id dc4a50c762b6afa8e1de9043d3c7b21738767f41fa4ae1271387abcd94b298ec Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.683219 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-bp6hz_53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28/console/0.log" Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.683351 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.703652 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-oauth-serving-cert\") pod \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.703762 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-console-config\") pod \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.703792 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-console-serving-cert\") pod \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.703837 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-trusted-ca-bundle\") pod \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " Sep 29 10:57:32 crc kubenswrapper[4752]: 
I0929 10:57:32.703869 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-service-ca\") pod \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.703993 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-console-oauth-config\") pod \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.704093 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk6x6\" (UniqueName: \"kubernetes.io/projected/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-kube-api-access-hk6x6\") pod \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\" (UID: \"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28\") " Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.704960 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-service-ca" (OuterVolumeSpecName: "service-ca") pod "53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28" (UID: "53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.705269 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28" (UID: "53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.705368 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28" (UID: "53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.705389 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-console-config" (OuterVolumeSpecName: "console-config") pod "53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28" (UID: "53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.710354 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28" (UID: "53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.710566 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-kube-api-access-hk6x6" (OuterVolumeSpecName: "kube-api-access-hk6x6") pod "53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28" (UID: "53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28"). InnerVolumeSpecName "kube-api-access-hk6x6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.711195 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28" (UID: "53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.805728 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk6x6\" (UniqueName: \"kubernetes.io/projected/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-kube-api-access-hk6x6\") on node \"crc\" DevicePath \"\"" Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.806343 4752 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.806730 4752 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-console-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.806744 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.806756 4752 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.806769 4752 reconciler_common.go:293] "Volume detached 
for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:57:32 crc kubenswrapper[4752]: I0929 10:57:32.806782 4752 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:57:33 crc kubenswrapper[4752]: I0929 10:57:33.373996 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-bp6hz_53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28/console/0.log" Sep 29 10:57:33 crc kubenswrapper[4752]: I0929 10:57:33.374061 4752 generic.go:334] "Generic (PLEG): container finished" podID="53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28" containerID="69f7996091c1816562d3e389be666e8d0a58e4432aa8c659b999aa8397210cd8" exitCode=2 Sep 29 10:57:33 crc kubenswrapper[4752]: I0929 10:57:33.374167 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-bp6hz" Sep 29 10:57:33 crc kubenswrapper[4752]: I0929 10:57:33.374167 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bp6hz" event={"ID":"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28","Type":"ContainerDied","Data":"69f7996091c1816562d3e389be666e8d0a58e4432aa8c659b999aa8397210cd8"} Sep 29 10:57:33 crc kubenswrapper[4752]: I0929 10:57:33.374237 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bp6hz" event={"ID":"53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28","Type":"ContainerDied","Data":"bd4039e4b2e06554d906d80920591716a87dcfd00dadba613f66b85043d2facb"} Sep 29 10:57:33 crc kubenswrapper[4752]: I0929 10:57:33.374263 4752 scope.go:117] "RemoveContainer" containerID="69f7996091c1816562d3e389be666e8d0a58e4432aa8c659b999aa8397210cd8" Sep 29 10:57:33 crc kubenswrapper[4752]: I0929 10:57:33.376293 4752 generic.go:334] "Generic (PLEG): container finished" podID="71f15cd4-4eb9-4fcb-b2e9-789c06fa9670" containerID="e6e97bc8c99fa9de98a4b46d25d0f9af3fab4123e6184c8868eb2d8308d7b224" exitCode=0 Sep 29 10:57:33 crc kubenswrapper[4752]: I0929 10:57:33.376333 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4" event={"ID":"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670","Type":"ContainerDied","Data":"e6e97bc8c99fa9de98a4b46d25d0f9af3fab4123e6184c8868eb2d8308d7b224"} Sep 29 10:57:33 crc kubenswrapper[4752]: I0929 10:57:33.376373 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4" event={"ID":"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670","Type":"ContainerStarted","Data":"dc4a50c762b6afa8e1de9043d3c7b21738767f41fa4ae1271387abcd94b298ec"} Sep 29 10:57:33 crc kubenswrapper[4752]: I0929 10:57:33.403726 4752 scope.go:117] "RemoveContainer" 
containerID="69f7996091c1816562d3e389be666e8d0a58e4432aa8c659b999aa8397210cd8" Sep 29 10:57:33 crc kubenswrapper[4752]: E0929 10:57:33.405327 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69f7996091c1816562d3e389be666e8d0a58e4432aa8c659b999aa8397210cd8\": container with ID starting with 69f7996091c1816562d3e389be666e8d0a58e4432aa8c659b999aa8397210cd8 not found: ID does not exist" containerID="69f7996091c1816562d3e389be666e8d0a58e4432aa8c659b999aa8397210cd8" Sep 29 10:57:33 crc kubenswrapper[4752]: I0929 10:57:33.405468 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69f7996091c1816562d3e389be666e8d0a58e4432aa8c659b999aa8397210cd8"} err="failed to get container status \"69f7996091c1816562d3e389be666e8d0a58e4432aa8c659b999aa8397210cd8\": rpc error: code = NotFound desc = could not find container \"69f7996091c1816562d3e389be666e8d0a58e4432aa8c659b999aa8397210cd8\": container with ID starting with 69f7996091c1816562d3e389be666e8d0a58e4432aa8c659b999aa8397210cd8 not found: ID does not exist" Sep 29 10:57:33 crc kubenswrapper[4752]: I0929 10:57:33.416330 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-bp6hz"] Sep 29 10:57:33 crc kubenswrapper[4752]: I0929 10:57:33.429783 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-bp6hz"] Sep 29 10:57:33 crc kubenswrapper[4752]: I0929 10:57:33.577951 4752 patch_prober.go:28] interesting pod/console-f9d7485db-bp6hz container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 29 10:57:33 crc kubenswrapper[4752]: I0929 10:57:33.578028 4752 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/console-f9d7485db-bp6hz" podUID="53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28" containerName="console" probeResult="failure" output="Get \"https://10.217.0.35:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 29 10:57:34 crc kubenswrapper[4752]: I0929 10:57:34.040394 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28" path="/var/lib/kubelet/pods/53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28/volumes" Sep 29 10:57:34 crc kubenswrapper[4752]: I0929 10:57:34.354754 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hqlqv"] Sep 29 10:57:34 crc kubenswrapper[4752]: E0929 10:57:34.355116 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28" containerName="console" Sep 29 10:57:34 crc kubenswrapper[4752]: I0929 10:57:34.355143 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28" containerName="console" Sep 29 10:57:34 crc kubenswrapper[4752]: I0929 10:57:34.355448 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="53fa29ee-8f5a-4c5b-9d74-3ff726f5ed28" containerName="console" Sep 29 10:57:34 crc kubenswrapper[4752]: I0929 10:57:34.356600 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hqlqv" Sep 29 10:57:34 crc kubenswrapper[4752]: I0929 10:57:34.366600 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hqlqv"] Sep 29 10:57:34 crc kubenswrapper[4752]: I0929 10:57:34.430223 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a95b36e-5f4a-4681-9dce-b4980f41ec2a-catalog-content\") pod \"redhat-operators-hqlqv\" (UID: \"4a95b36e-5f4a-4681-9dce-b4980f41ec2a\") " pod="openshift-marketplace/redhat-operators-hqlqv" Sep 29 10:57:34 crc kubenswrapper[4752]: I0929 10:57:34.430297 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a95b36e-5f4a-4681-9dce-b4980f41ec2a-utilities\") pod \"redhat-operators-hqlqv\" (UID: \"4a95b36e-5f4a-4681-9dce-b4980f41ec2a\") " pod="openshift-marketplace/redhat-operators-hqlqv" Sep 29 10:57:34 crc kubenswrapper[4752]: I0929 10:57:34.430321 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcrmx\" (UniqueName: \"kubernetes.io/projected/4a95b36e-5f4a-4681-9dce-b4980f41ec2a-kube-api-access-kcrmx\") pod \"redhat-operators-hqlqv\" (UID: \"4a95b36e-5f4a-4681-9dce-b4980f41ec2a\") " pod="openshift-marketplace/redhat-operators-hqlqv" Sep 29 10:57:34 crc kubenswrapper[4752]: I0929 10:57:34.531612 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a95b36e-5f4a-4681-9dce-b4980f41ec2a-utilities\") pod \"redhat-operators-hqlqv\" (UID: \"4a95b36e-5f4a-4681-9dce-b4980f41ec2a\") " pod="openshift-marketplace/redhat-operators-hqlqv" Sep 29 10:57:34 crc kubenswrapper[4752]: I0929 10:57:34.531677 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kcrmx\" (UniqueName: \"kubernetes.io/projected/4a95b36e-5f4a-4681-9dce-b4980f41ec2a-kube-api-access-kcrmx\") pod \"redhat-operators-hqlqv\" (UID: \"4a95b36e-5f4a-4681-9dce-b4980f41ec2a\") " pod="openshift-marketplace/redhat-operators-hqlqv" Sep 29 10:57:34 crc kubenswrapper[4752]: I0929 10:57:34.531761 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a95b36e-5f4a-4681-9dce-b4980f41ec2a-catalog-content\") pod \"redhat-operators-hqlqv\" (UID: \"4a95b36e-5f4a-4681-9dce-b4980f41ec2a\") " pod="openshift-marketplace/redhat-operators-hqlqv" Sep 29 10:57:34 crc kubenswrapper[4752]: I0929 10:57:34.532194 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a95b36e-5f4a-4681-9dce-b4980f41ec2a-utilities\") pod \"redhat-operators-hqlqv\" (UID: \"4a95b36e-5f4a-4681-9dce-b4980f41ec2a\") " pod="openshift-marketplace/redhat-operators-hqlqv" Sep 29 10:57:34 crc kubenswrapper[4752]: I0929 10:57:34.532289 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a95b36e-5f4a-4681-9dce-b4980f41ec2a-catalog-content\") pod \"redhat-operators-hqlqv\" (UID: \"4a95b36e-5f4a-4681-9dce-b4980f41ec2a\") " pod="openshift-marketplace/redhat-operators-hqlqv" Sep 29 10:57:34 crc kubenswrapper[4752]: I0929 10:57:34.557454 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcrmx\" (UniqueName: \"kubernetes.io/projected/4a95b36e-5f4a-4681-9dce-b4980f41ec2a-kube-api-access-kcrmx\") pod \"redhat-operators-hqlqv\" (UID: \"4a95b36e-5f4a-4681-9dce-b4980f41ec2a\") " pod="openshift-marketplace/redhat-operators-hqlqv" Sep 29 10:57:34 crc kubenswrapper[4752]: I0929 10:57:34.677766 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hqlqv" Sep 29 10:57:34 crc kubenswrapper[4752]: I0929 10:57:34.889086 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hqlqv"] Sep 29 10:57:35 crc kubenswrapper[4752]: I0929 10:57:35.393158 4752 generic.go:334] "Generic (PLEG): container finished" podID="71f15cd4-4eb9-4fcb-b2e9-789c06fa9670" containerID="4d1be4a78f62e2f8c0698ccb6e238a7bd562de54a750dab5c00f5936270fc138" exitCode=0 Sep 29 10:57:35 crc kubenswrapper[4752]: I0929 10:57:35.393208 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4" event={"ID":"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670","Type":"ContainerDied","Data":"4d1be4a78f62e2f8c0698ccb6e238a7bd562de54a750dab5c00f5936270fc138"} Sep 29 10:57:35 crc kubenswrapper[4752]: I0929 10:57:35.395724 4752 generic.go:334] "Generic (PLEG): container finished" podID="4a95b36e-5f4a-4681-9dce-b4980f41ec2a" containerID="d3a35ca2bc1823b383760802c9f59509f3fc6e9a23639c4a471328488e52dd52" exitCode=0 Sep 29 10:57:35 crc kubenswrapper[4752]: I0929 10:57:35.395767 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqlqv" event={"ID":"4a95b36e-5f4a-4681-9dce-b4980f41ec2a","Type":"ContainerDied","Data":"d3a35ca2bc1823b383760802c9f59509f3fc6e9a23639c4a471328488e52dd52"} Sep 29 10:57:35 crc kubenswrapper[4752]: I0929 10:57:35.395794 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqlqv" event={"ID":"4a95b36e-5f4a-4681-9dce-b4980f41ec2a","Type":"ContainerStarted","Data":"760c1dd8b7ea6c62e454da7ef24b1a9d768761f964838793989eec52b7cf0561"} Sep 29 10:57:36 crc kubenswrapper[4752]: I0929 10:57:36.406248 4752 generic.go:334] "Generic (PLEG): container finished" podID="71f15cd4-4eb9-4fcb-b2e9-789c06fa9670" 
containerID="ecef968e6d79f3c1663c5723b7978754ae990ed0bc386fb1607bc8310d9a1b13" exitCode=0 Sep 29 10:57:36 crc kubenswrapper[4752]: I0929 10:57:36.406312 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4" event={"ID":"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670","Type":"ContainerDied","Data":"ecef968e6d79f3c1663c5723b7978754ae990ed0bc386fb1607bc8310d9a1b13"} Sep 29 10:57:37 crc kubenswrapper[4752]: I0929 10:57:37.650575 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4" Sep 29 10:57:37 crc kubenswrapper[4752]: I0929 10:57:37.684203 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/71f15cd4-4eb9-4fcb-b2e9-789c06fa9670-util\") pod \"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670\" (UID: \"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670\") " Sep 29 10:57:37 crc kubenswrapper[4752]: I0929 10:57:37.684348 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrwkg\" (UniqueName: \"kubernetes.io/projected/71f15cd4-4eb9-4fcb-b2e9-789c06fa9670-kube-api-access-vrwkg\") pod \"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670\" (UID: \"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670\") " Sep 29 10:57:37 crc kubenswrapper[4752]: I0929 10:57:37.684378 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/71f15cd4-4eb9-4fcb-b2e9-789c06fa9670-bundle\") pod \"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670\" (UID: \"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670\") " Sep 29 10:57:37 crc kubenswrapper[4752]: I0929 10:57:37.685611 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71f15cd4-4eb9-4fcb-b2e9-789c06fa9670-bundle" (OuterVolumeSpecName: "bundle") pod 
"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670" (UID: "71f15cd4-4eb9-4fcb-b2e9-789c06fa9670"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:57:37 crc kubenswrapper[4752]: I0929 10:57:37.692918 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71f15cd4-4eb9-4fcb-b2e9-789c06fa9670-kube-api-access-vrwkg" (OuterVolumeSpecName: "kube-api-access-vrwkg") pod "71f15cd4-4eb9-4fcb-b2e9-789c06fa9670" (UID: "71f15cd4-4eb9-4fcb-b2e9-789c06fa9670"). InnerVolumeSpecName "kube-api-access-vrwkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:57:37 crc kubenswrapper[4752]: I0929 10:57:37.703094 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71f15cd4-4eb9-4fcb-b2e9-789c06fa9670-util" (OuterVolumeSpecName: "util") pod "71f15cd4-4eb9-4fcb-b2e9-789c06fa9670" (UID: "71f15cd4-4eb9-4fcb-b2e9-789c06fa9670"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:57:37 crc kubenswrapper[4752]: I0929 10:57:37.708928 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dsdrq" Sep 29 10:57:37 crc kubenswrapper[4752]: I0929 10:57:37.709795 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dsdrq" Sep 29 10:57:37 crc kubenswrapper[4752]: I0929 10:57:37.755545 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dsdrq" Sep 29 10:57:37 crc kubenswrapper[4752]: I0929 10:57:37.786326 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrwkg\" (UniqueName: \"kubernetes.io/projected/71f15cd4-4eb9-4fcb-b2e9-789c06fa9670-kube-api-access-vrwkg\") on node \"crc\" DevicePath \"\"" Sep 29 10:57:37 crc kubenswrapper[4752]: I0929 10:57:37.786872 4752 reconciler_common.go:293] "Volume 
detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/71f15cd4-4eb9-4fcb-b2e9-789c06fa9670-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:57:37 crc kubenswrapper[4752]: I0929 10:57:37.786884 4752 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/71f15cd4-4eb9-4fcb-b2e9-789c06fa9670-util\") on node \"crc\" DevicePath \"\"" Sep 29 10:57:38 crc kubenswrapper[4752]: I0929 10:57:38.421786 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4" Sep 29 10:57:38 crc kubenswrapper[4752]: I0929 10:57:38.421789 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4" event={"ID":"71f15cd4-4eb9-4fcb-b2e9-789c06fa9670","Type":"ContainerDied","Data":"dc4a50c762b6afa8e1de9043d3c7b21738767f41fa4ae1271387abcd94b298ec"} Sep 29 10:57:38 crc kubenswrapper[4752]: I0929 10:57:38.422027 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc4a50c762b6afa8e1de9043d3c7b21738767f41fa4ae1271387abcd94b298ec" Sep 29 10:57:38 crc kubenswrapper[4752]: I0929 10:57:38.462847 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dsdrq" Sep 29 10:57:40 crc kubenswrapper[4752]: I0929 10:57:40.550477 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dsdrq"] Sep 29 10:57:41 crc kubenswrapper[4752]: I0929 10:57:41.441247 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dsdrq" podUID="3114de4f-e714-4102-ae6e-53e08a01a180" containerName="registry-server" containerID="cri-o://ace76e5d37482d9ac42c361b8deef4423420d53e88901f5a668a505b66048619" gracePeriod=2 Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 
10:57:42.386174 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dsdrq" Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.454352 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqlqv" event={"ID":"4a95b36e-5f4a-4681-9dce-b4980f41ec2a","Type":"ContainerStarted","Data":"d1f3241e953ba7fa801c74b126ef725337953bfd0224c1ccc589ce26995822f8"} Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.457357 4752 generic.go:334] "Generic (PLEG): container finished" podID="3114de4f-e714-4102-ae6e-53e08a01a180" containerID="ace76e5d37482d9ac42c361b8deef4423420d53e88901f5a668a505b66048619" exitCode=0 Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.457425 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dsdrq" Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.457452 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsdrq" event={"ID":"3114de4f-e714-4102-ae6e-53e08a01a180","Type":"ContainerDied","Data":"ace76e5d37482d9ac42c361b8deef4423420d53e88901f5a668a505b66048619"} Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.457902 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsdrq" event={"ID":"3114de4f-e714-4102-ae6e-53e08a01a180","Type":"ContainerDied","Data":"4894eefb8766fdd93f0073ab449bdbd83f0715bb21ef3a3a22c2256a49e85d8e"} Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.457940 4752 scope.go:117] "RemoveContainer" containerID="ace76e5d37482d9ac42c361b8deef4423420d53e88901f5a668a505b66048619" Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.474180 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3114de4f-e714-4102-ae6e-53e08a01a180-catalog-content\") pod \"3114de4f-e714-4102-ae6e-53e08a01a180\" (UID: \"3114de4f-e714-4102-ae6e-53e08a01a180\") " Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.474247 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3114de4f-e714-4102-ae6e-53e08a01a180-utilities\") pod \"3114de4f-e714-4102-ae6e-53e08a01a180\" (UID: \"3114de4f-e714-4102-ae6e-53e08a01a180\") " Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.474409 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttv4f\" (UniqueName: \"kubernetes.io/projected/3114de4f-e714-4102-ae6e-53e08a01a180-kube-api-access-ttv4f\") pod \"3114de4f-e714-4102-ae6e-53e08a01a180\" (UID: \"3114de4f-e714-4102-ae6e-53e08a01a180\") " Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.475709 4752 scope.go:117] "RemoveContainer" containerID="49693c9523a87b1d9e31c2393e5d28934a3cce37f664b7045ea23f2f07558786" Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.478291 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3114de4f-e714-4102-ae6e-53e08a01a180-utilities" (OuterVolumeSpecName: "utilities") pod "3114de4f-e714-4102-ae6e-53e08a01a180" (UID: "3114de4f-e714-4102-ae6e-53e08a01a180"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.501148 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3114de4f-e714-4102-ae6e-53e08a01a180-kube-api-access-ttv4f" (OuterVolumeSpecName: "kube-api-access-ttv4f") pod "3114de4f-e714-4102-ae6e-53e08a01a180" (UID: "3114de4f-e714-4102-ae6e-53e08a01a180"). InnerVolumeSpecName "kube-api-access-ttv4f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.501164 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3114de4f-e714-4102-ae6e-53e08a01a180-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3114de4f-e714-4102-ae6e-53e08a01a180" (UID: "3114de4f-e714-4102-ae6e-53e08a01a180"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.520847 4752 scope.go:117] "RemoveContainer" containerID="1c99f5b857825fed73823e6d456a3aa5c2acc5e6798aa7c39fe73ca66e7d2734" Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.535244 4752 scope.go:117] "RemoveContainer" containerID="ace76e5d37482d9ac42c361b8deef4423420d53e88901f5a668a505b66048619" Sep 29 10:57:42 crc kubenswrapper[4752]: E0929 10:57:42.535951 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ace76e5d37482d9ac42c361b8deef4423420d53e88901f5a668a505b66048619\": container with ID starting with ace76e5d37482d9ac42c361b8deef4423420d53e88901f5a668a505b66048619 not found: ID does not exist" containerID="ace76e5d37482d9ac42c361b8deef4423420d53e88901f5a668a505b66048619" Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.536036 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace76e5d37482d9ac42c361b8deef4423420d53e88901f5a668a505b66048619"} err="failed to get container status \"ace76e5d37482d9ac42c361b8deef4423420d53e88901f5a668a505b66048619\": rpc error: code = NotFound desc = could not find container \"ace76e5d37482d9ac42c361b8deef4423420d53e88901f5a668a505b66048619\": container with ID starting with ace76e5d37482d9ac42c361b8deef4423420d53e88901f5a668a505b66048619 not found: ID does not exist" Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.536099 4752 scope.go:117] 
"RemoveContainer" containerID="49693c9523a87b1d9e31c2393e5d28934a3cce37f664b7045ea23f2f07558786" Sep 29 10:57:42 crc kubenswrapper[4752]: E0929 10:57:42.536558 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49693c9523a87b1d9e31c2393e5d28934a3cce37f664b7045ea23f2f07558786\": container with ID starting with 49693c9523a87b1d9e31c2393e5d28934a3cce37f664b7045ea23f2f07558786 not found: ID does not exist" containerID="49693c9523a87b1d9e31c2393e5d28934a3cce37f664b7045ea23f2f07558786" Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.536591 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49693c9523a87b1d9e31c2393e5d28934a3cce37f664b7045ea23f2f07558786"} err="failed to get container status \"49693c9523a87b1d9e31c2393e5d28934a3cce37f664b7045ea23f2f07558786\": rpc error: code = NotFound desc = could not find container \"49693c9523a87b1d9e31c2393e5d28934a3cce37f664b7045ea23f2f07558786\": container with ID starting with 49693c9523a87b1d9e31c2393e5d28934a3cce37f664b7045ea23f2f07558786 not found: ID does not exist" Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.536618 4752 scope.go:117] "RemoveContainer" containerID="1c99f5b857825fed73823e6d456a3aa5c2acc5e6798aa7c39fe73ca66e7d2734" Sep 29 10:57:42 crc kubenswrapper[4752]: E0929 10:57:42.536995 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c99f5b857825fed73823e6d456a3aa5c2acc5e6798aa7c39fe73ca66e7d2734\": container with ID starting with 1c99f5b857825fed73823e6d456a3aa5c2acc5e6798aa7c39fe73ca66e7d2734 not found: ID does not exist" containerID="1c99f5b857825fed73823e6d456a3aa5c2acc5e6798aa7c39fe73ca66e7d2734" Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.537038 4752 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1c99f5b857825fed73823e6d456a3aa5c2acc5e6798aa7c39fe73ca66e7d2734"} err="failed to get container status \"1c99f5b857825fed73823e6d456a3aa5c2acc5e6798aa7c39fe73ca66e7d2734\": rpc error: code = NotFound desc = could not find container \"1c99f5b857825fed73823e6d456a3aa5c2acc5e6798aa7c39fe73ca66e7d2734\": container with ID starting with 1c99f5b857825fed73823e6d456a3aa5c2acc5e6798aa7c39fe73ca66e7d2734 not found: ID does not exist" Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.576680 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttv4f\" (UniqueName: \"kubernetes.io/projected/3114de4f-e714-4102-ae6e-53e08a01a180-kube-api-access-ttv4f\") on node \"crc\" DevicePath \"\"" Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.576729 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3114de4f-e714-4102-ae6e-53e08a01a180-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.576747 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3114de4f-e714-4102-ae6e-53e08a01a180-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.817537 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dsdrq"] Sep 29 10:57:42 crc kubenswrapper[4752]: I0929 10:57:42.835002 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dsdrq"] Sep 29 10:57:43 crc kubenswrapper[4752]: I0929 10:57:43.467158 4752 generic.go:334] "Generic (PLEG): container finished" podID="4a95b36e-5f4a-4681-9dce-b4980f41ec2a" containerID="d1f3241e953ba7fa801c74b126ef725337953bfd0224c1ccc589ce26995822f8" exitCode=0 Sep 29 10:57:43 crc kubenswrapper[4752]: I0929 10:57:43.467199 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-hqlqv" event={"ID":"4a95b36e-5f4a-4681-9dce-b4980f41ec2a","Type":"ContainerDied","Data":"d1f3241e953ba7fa801c74b126ef725337953bfd0224c1ccc589ce26995822f8"} Sep 29 10:57:44 crc kubenswrapper[4752]: I0929 10:57:44.045196 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3114de4f-e714-4102-ae6e-53e08a01a180" path="/var/lib/kubelet/pods/3114de4f-e714-4102-ae6e-53e08a01a180/volumes" Sep 29 10:57:44 crc kubenswrapper[4752]: I0929 10:57:44.477815 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqlqv" event={"ID":"4a95b36e-5f4a-4681-9dce-b4980f41ec2a","Type":"ContainerStarted","Data":"0bcb145f683563cfda395d92f4e8f5d165dca42c8af853b806da900fc703bdae"} Sep 29 10:57:44 crc kubenswrapper[4752]: I0929 10:57:44.503328 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hqlqv" podStartSLOduration=1.924785204 podStartE2EDuration="10.503297059s" podCreationTimestamp="2025-09-29 10:57:34 +0000 UTC" firstStartedPulling="2025-09-29 10:57:35.398030444 +0000 UTC m=+796.187172111" lastFinishedPulling="2025-09-29 10:57:43.976542299 +0000 UTC m=+804.765683966" observedRunningTime="2025-09-29 10:57:44.502117828 +0000 UTC m=+805.291259525" watchObservedRunningTime="2025-09-29 10:57:44.503297059 +0000 UTC m=+805.292438726" Sep 29 10:57:44 crc kubenswrapper[4752]: I0929 10:57:44.678543 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hqlqv" Sep 29 10:57:44 crc kubenswrapper[4752]: I0929 10:57:44.678659 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hqlqv" Sep 29 10:57:45 crc kubenswrapper[4752]: I0929 10:57:45.770999 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hqlqv" podUID="4a95b36e-5f4a-4681-9dce-b4980f41ec2a" 
containerName="registry-server" probeResult="failure" output=< Sep 29 10:57:45 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Sep 29 10:57:45 crc kubenswrapper[4752]: > Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.462316 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-74ffd6549c-bbnch"] Sep 29 10:57:48 crc kubenswrapper[4752]: E0929 10:57:48.464313 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3114de4f-e714-4102-ae6e-53e08a01a180" containerName="extract-content" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.464413 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3114de4f-e714-4102-ae6e-53e08a01a180" containerName="extract-content" Sep 29 10:57:48 crc kubenswrapper[4752]: E0929 10:57:48.464515 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3114de4f-e714-4102-ae6e-53e08a01a180" containerName="registry-server" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.464569 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3114de4f-e714-4102-ae6e-53e08a01a180" containerName="registry-server" Sep 29 10:57:48 crc kubenswrapper[4752]: E0929 10:57:48.464644 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71f15cd4-4eb9-4fcb-b2e9-789c06fa9670" containerName="pull" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.464709 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="71f15cd4-4eb9-4fcb-b2e9-789c06fa9670" containerName="pull" Sep 29 10:57:48 crc kubenswrapper[4752]: E0929 10:57:48.464769 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3114de4f-e714-4102-ae6e-53e08a01a180" containerName="extract-utilities" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.464837 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3114de4f-e714-4102-ae6e-53e08a01a180" containerName="extract-utilities" Sep 29 10:57:48 crc 
kubenswrapper[4752]: E0929 10:57:48.464901 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71f15cd4-4eb9-4fcb-b2e9-789c06fa9670" containerName="util" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.464952 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="71f15cd4-4eb9-4fcb-b2e9-789c06fa9670" containerName="util" Sep 29 10:57:48 crc kubenswrapper[4752]: E0929 10:57:48.465007 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71f15cd4-4eb9-4fcb-b2e9-789c06fa9670" containerName="extract" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.465067 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="71f15cd4-4eb9-4fcb-b2e9-789c06fa9670" containerName="extract" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.465277 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="3114de4f-e714-4102-ae6e-53e08a01a180" containerName="registry-server" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.465371 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="71f15cd4-4eb9-4fcb-b2e9-789c06fa9670" containerName="extract" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.466020 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-74ffd6549c-bbnch" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.468989 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.469049 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.469644 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.469894 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.547732 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-74ffd6549c-bbnch"] Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.573838 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/945488c0-5a99-455c-b0da-f3d188ff4438-apiservice-cert\") pod \"metallb-operator-controller-manager-74ffd6549c-bbnch\" (UID: \"945488c0-5a99-455c-b0da-f3d188ff4438\") " pod="metallb-system/metallb-operator-controller-manager-74ffd6549c-bbnch" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.574264 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/945488c0-5a99-455c-b0da-f3d188ff4438-webhook-cert\") pod \"metallb-operator-controller-manager-74ffd6549c-bbnch\" (UID: \"945488c0-5a99-455c-b0da-f3d188ff4438\") " pod="metallb-system/metallb-operator-controller-manager-74ffd6549c-bbnch" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.574395 
4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvzgx\" (UniqueName: \"kubernetes.io/projected/945488c0-5a99-455c-b0da-f3d188ff4438-kube-api-access-dvzgx\") pod \"metallb-operator-controller-manager-74ffd6549c-bbnch\" (UID: \"945488c0-5a99-455c-b0da-f3d188ff4438\") " pod="metallb-system/metallb-operator-controller-manager-74ffd6549c-bbnch" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.676123 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvzgx\" (UniqueName: \"kubernetes.io/projected/945488c0-5a99-455c-b0da-f3d188ff4438-kube-api-access-dvzgx\") pod \"metallb-operator-controller-manager-74ffd6549c-bbnch\" (UID: \"945488c0-5a99-455c-b0da-f3d188ff4438\") " pod="metallb-system/metallb-operator-controller-manager-74ffd6549c-bbnch" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.677535 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/945488c0-5a99-455c-b0da-f3d188ff4438-apiservice-cert\") pod \"metallb-operator-controller-manager-74ffd6549c-bbnch\" (UID: \"945488c0-5a99-455c-b0da-f3d188ff4438\") " pod="metallb-system/metallb-operator-controller-manager-74ffd6549c-bbnch" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.677695 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/945488c0-5a99-455c-b0da-f3d188ff4438-webhook-cert\") pod \"metallb-operator-controller-manager-74ffd6549c-bbnch\" (UID: \"945488c0-5a99-455c-b0da-f3d188ff4438\") " pod="metallb-system/metallb-operator-controller-manager-74ffd6549c-bbnch" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.685202 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/945488c0-5a99-455c-b0da-f3d188ff4438-apiservice-cert\") pod 
\"metallb-operator-controller-manager-74ffd6549c-bbnch\" (UID: \"945488c0-5a99-455c-b0da-f3d188ff4438\") " pod="metallb-system/metallb-operator-controller-manager-74ffd6549c-bbnch" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.690907 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/945488c0-5a99-455c-b0da-f3d188ff4438-webhook-cert\") pod \"metallb-operator-controller-manager-74ffd6549c-bbnch\" (UID: \"945488c0-5a99-455c-b0da-f3d188ff4438\") " pod="metallb-system/metallb-operator-controller-manager-74ffd6549c-bbnch" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.722575 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvzgx\" (UniqueName: \"kubernetes.io/projected/945488c0-5a99-455c-b0da-f3d188ff4438-kube-api-access-dvzgx\") pod \"metallb-operator-controller-manager-74ffd6549c-bbnch\" (UID: \"945488c0-5a99-455c-b0da-f3d188ff4438\") " pod="metallb-system/metallb-operator-controller-manager-74ffd6549c-bbnch" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.753838 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7cb899cd5c-bldh9"] Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.754863 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7cb899cd5c-bldh9" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.757305 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.757579 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.769899 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7cb899cd5c-bldh9"] Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.786381 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-74ffd6549c-bbnch" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.881041 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9eeaaa19-fdbf-4d23-90bb-94c687033574-webhook-cert\") pod \"metallb-operator-webhook-server-7cb899cd5c-bldh9\" (UID: \"9eeaaa19-fdbf-4d23-90bb-94c687033574\") " pod="metallb-system/metallb-operator-webhook-server-7cb899cd5c-bldh9" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.881154 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9eeaaa19-fdbf-4d23-90bb-94c687033574-apiservice-cert\") pod \"metallb-operator-webhook-server-7cb899cd5c-bldh9\" (UID: \"9eeaaa19-fdbf-4d23-90bb-94c687033574\") " pod="metallb-system/metallb-operator-webhook-server-7cb899cd5c-bldh9" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.881407 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv7l5\" (UniqueName: 
\"kubernetes.io/projected/9eeaaa19-fdbf-4d23-90bb-94c687033574-kube-api-access-hv7l5\") pod \"metallb-operator-webhook-server-7cb899cd5c-bldh9\" (UID: \"9eeaaa19-fdbf-4d23-90bb-94c687033574\") " pod="metallb-system/metallb-operator-webhook-server-7cb899cd5c-bldh9" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.982570 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9eeaaa19-fdbf-4d23-90bb-94c687033574-apiservice-cert\") pod \"metallb-operator-webhook-server-7cb899cd5c-bldh9\" (UID: \"9eeaaa19-fdbf-4d23-90bb-94c687033574\") " pod="metallb-system/metallb-operator-webhook-server-7cb899cd5c-bldh9" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.982750 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv7l5\" (UniqueName: \"kubernetes.io/projected/9eeaaa19-fdbf-4d23-90bb-94c687033574-kube-api-access-hv7l5\") pod \"metallb-operator-webhook-server-7cb899cd5c-bldh9\" (UID: \"9eeaaa19-fdbf-4d23-90bb-94c687033574\") " pod="metallb-system/metallb-operator-webhook-server-7cb899cd5c-bldh9" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.982826 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9eeaaa19-fdbf-4d23-90bb-94c687033574-webhook-cert\") pod \"metallb-operator-webhook-server-7cb899cd5c-bldh9\" (UID: \"9eeaaa19-fdbf-4d23-90bb-94c687033574\") " pod="metallb-system/metallb-operator-webhook-server-7cb899cd5c-bldh9" Sep 29 10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.994676 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9eeaaa19-fdbf-4d23-90bb-94c687033574-webhook-cert\") pod \"metallb-operator-webhook-server-7cb899cd5c-bldh9\" (UID: \"9eeaaa19-fdbf-4d23-90bb-94c687033574\") " pod="metallb-system/metallb-operator-webhook-server-7cb899cd5c-bldh9" Sep 29 
10:57:48 crc kubenswrapper[4752]: I0929 10:57:48.996752 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9eeaaa19-fdbf-4d23-90bb-94c687033574-apiservice-cert\") pod \"metallb-operator-webhook-server-7cb899cd5c-bldh9\" (UID: \"9eeaaa19-fdbf-4d23-90bb-94c687033574\") " pod="metallb-system/metallb-operator-webhook-server-7cb899cd5c-bldh9" Sep 29 10:57:49 crc kubenswrapper[4752]: I0929 10:57:49.031755 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv7l5\" (UniqueName: \"kubernetes.io/projected/9eeaaa19-fdbf-4d23-90bb-94c687033574-kube-api-access-hv7l5\") pod \"metallb-operator-webhook-server-7cb899cd5c-bldh9\" (UID: \"9eeaaa19-fdbf-4d23-90bb-94c687033574\") " pod="metallb-system/metallb-operator-webhook-server-7cb899cd5c-bldh9" Sep 29 10:57:49 crc kubenswrapper[4752]: I0929 10:57:49.086643 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7cb899cd5c-bldh9" Sep 29 10:57:49 crc kubenswrapper[4752]: I0929 10:57:49.273867 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-74ffd6549c-bbnch"] Sep 29 10:57:49 crc kubenswrapper[4752]: W0929 10:57:49.286454 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod945488c0_5a99_455c_b0da_f3d188ff4438.slice/crio-c9d56b004fa7bb316a9ea87eb30f4d9a09ed899242a1ca23c15299cc53b86cde WatchSource:0}: Error finding container c9d56b004fa7bb316a9ea87eb30f4d9a09ed899242a1ca23c15299cc53b86cde: Status 404 returned error can't find the container with id c9d56b004fa7bb316a9ea87eb30f4d9a09ed899242a1ca23c15299cc53b86cde Sep 29 10:57:49 crc kubenswrapper[4752]: I0929 10:57:49.467037 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7cb899cd5c-bldh9"] Sep 29 
10:57:49 crc kubenswrapper[4752]: W0929 10:57:49.474989 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9eeaaa19_fdbf_4d23_90bb_94c687033574.slice/crio-383fc209a88fcd0b3ff2031484da57df0a0f1e2874d6f127d10b50174553537d WatchSource:0}: Error finding container 383fc209a88fcd0b3ff2031484da57df0a0f1e2874d6f127d10b50174553537d: Status 404 returned error can't find the container with id 383fc209a88fcd0b3ff2031484da57df0a0f1e2874d6f127d10b50174553537d Sep 29 10:57:49 crc kubenswrapper[4752]: I0929 10:57:49.521783 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7cb899cd5c-bldh9" event={"ID":"9eeaaa19-fdbf-4d23-90bb-94c687033574","Type":"ContainerStarted","Data":"383fc209a88fcd0b3ff2031484da57df0a0f1e2874d6f127d10b50174553537d"} Sep 29 10:57:49 crc kubenswrapper[4752]: I0929 10:57:49.524737 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-74ffd6549c-bbnch" event={"ID":"945488c0-5a99-455c-b0da-f3d188ff4438","Type":"ContainerStarted","Data":"c9d56b004fa7bb316a9ea87eb30f4d9a09ed899242a1ca23c15299cc53b86cde"} Sep 29 10:57:54 crc kubenswrapper[4752]: I0929 10:57:54.738375 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hqlqv" Sep 29 10:57:54 crc kubenswrapper[4752]: I0929 10:57:54.788085 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hqlqv" Sep 29 10:57:54 crc kubenswrapper[4752]: I0929 10:57:54.870934 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hqlqv"] Sep 29 10:57:54 crc kubenswrapper[4752]: I0929 10:57:54.972739 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t95zn"] Sep 29 10:57:54 crc kubenswrapper[4752]: I0929 10:57:54.973388 
4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t95zn" podUID="73deb0e5-ae20-4412-b500-571db85cc292" containerName="registry-server" containerID="cri-o://053570d9ffeec7129249cb9c99bef660d0ca55a7bc728bde0d6b106dbfa59a89" gracePeriod=2 Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.465004 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t95zn" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.507526 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lntjf\" (UniqueName: \"kubernetes.io/projected/73deb0e5-ae20-4412-b500-571db85cc292-kube-api-access-lntjf\") pod \"73deb0e5-ae20-4412-b500-571db85cc292\" (UID: \"73deb0e5-ae20-4412-b500-571db85cc292\") " Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.507698 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73deb0e5-ae20-4412-b500-571db85cc292-utilities\") pod \"73deb0e5-ae20-4412-b500-571db85cc292\" (UID: \"73deb0e5-ae20-4412-b500-571db85cc292\") " Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.507730 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73deb0e5-ae20-4412-b500-571db85cc292-catalog-content\") pod \"73deb0e5-ae20-4412-b500-571db85cc292\" (UID: \"73deb0e5-ae20-4412-b500-571db85cc292\") " Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.508979 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73deb0e5-ae20-4412-b500-571db85cc292-utilities" (OuterVolumeSpecName: "utilities") pod "73deb0e5-ae20-4412-b500-571db85cc292" (UID: "73deb0e5-ae20-4412-b500-571db85cc292"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.514114 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73deb0e5-ae20-4412-b500-571db85cc292-kube-api-access-lntjf" (OuterVolumeSpecName: "kube-api-access-lntjf") pod "73deb0e5-ae20-4412-b500-571db85cc292" (UID: "73deb0e5-ae20-4412-b500-571db85cc292"). InnerVolumeSpecName "kube-api-access-lntjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.569759 4752 generic.go:334] "Generic (PLEG): container finished" podID="73deb0e5-ae20-4412-b500-571db85cc292" containerID="053570d9ffeec7129249cb9c99bef660d0ca55a7bc728bde0d6b106dbfa59a89" exitCode=0 Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.569839 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t95zn" event={"ID":"73deb0e5-ae20-4412-b500-571db85cc292","Type":"ContainerDied","Data":"053570d9ffeec7129249cb9c99bef660d0ca55a7bc728bde0d6b106dbfa59a89"} Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.569843 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t95zn" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.569903 4752 scope.go:117] "RemoveContainer" containerID="053570d9ffeec7129249cb9c99bef660d0ca55a7bc728bde0d6b106dbfa59a89" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.569890 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t95zn" event={"ID":"73deb0e5-ae20-4412-b500-571db85cc292","Type":"ContainerDied","Data":"08b88607ff7b915bfd7ab7b65097b91893b965fbad64d3ecf4962873365b76e6"} Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.571407 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-74ffd6549c-bbnch" event={"ID":"945488c0-5a99-455c-b0da-f3d188ff4438","Type":"ContainerStarted","Data":"d422be8d698acff6e6e9bb04787e793c415e2083778e1fefc1b48411a71094a3"} Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.571557 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-74ffd6549c-bbnch" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.580242 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7cb899cd5c-bldh9" event={"ID":"9eeaaa19-fdbf-4d23-90bb-94c687033574","Type":"ContainerStarted","Data":"6c124931e9c17518915ff2843b7ad4b61927a2fdd8bc117acbd91541f9be4a58"} Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.580547 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7cb899cd5c-bldh9" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.590452 4752 scope.go:117] "RemoveContainer" containerID="2ae1e204533a8c3917baf71c728617b25db8d20622ed0bb1d704957a792e07b5" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.603507 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/metallb-operator-controller-manager-74ffd6549c-bbnch" podStartSLOduration=1.658077988 podStartE2EDuration="7.603484626s" podCreationTimestamp="2025-09-29 10:57:48 +0000 UTC" firstStartedPulling="2025-09-29 10:57:49.293150679 +0000 UTC m=+810.082292346" lastFinishedPulling="2025-09-29 10:57:55.238557317 +0000 UTC m=+816.027698984" observedRunningTime="2025-09-29 10:57:55.602435669 +0000 UTC m=+816.391577336" watchObservedRunningTime="2025-09-29 10:57:55.603484626 +0000 UTC m=+816.392626293" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.609419 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lntjf\" (UniqueName: \"kubernetes.io/projected/73deb0e5-ae20-4412-b500-571db85cc292-kube-api-access-lntjf\") on node \"crc\" DevicePath \"\"" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.609471 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73deb0e5-ae20-4412-b500-571db85cc292-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.617786 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73deb0e5-ae20-4412-b500-571db85cc292-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73deb0e5-ae20-4412-b500-571db85cc292" (UID: "73deb0e5-ae20-4412-b500-571db85cc292"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.624078 4752 scope.go:117] "RemoveContainer" containerID="751f408aa56a540d619d84417039fefd6f66961578040e4ad9d41d1d1c0d967f" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.626735 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7cb899cd5c-bldh9" podStartSLOduration=1.8340900690000002 podStartE2EDuration="7.626707003s" podCreationTimestamp="2025-09-29 10:57:48 +0000 UTC" firstStartedPulling="2025-09-29 10:57:49.478791291 +0000 UTC m=+810.267932958" lastFinishedPulling="2025-09-29 10:57:55.271408225 +0000 UTC m=+816.060549892" observedRunningTime="2025-09-29 10:57:55.621845525 +0000 UTC m=+816.410987202" watchObservedRunningTime="2025-09-29 10:57:55.626707003 +0000 UTC m=+816.415848670" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.640864 4752 scope.go:117] "RemoveContainer" containerID="053570d9ffeec7129249cb9c99bef660d0ca55a7bc728bde0d6b106dbfa59a89" Sep 29 10:57:55 crc kubenswrapper[4752]: E0929 10:57:55.642176 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"053570d9ffeec7129249cb9c99bef660d0ca55a7bc728bde0d6b106dbfa59a89\": container with ID starting with 053570d9ffeec7129249cb9c99bef660d0ca55a7bc728bde0d6b106dbfa59a89 not found: ID does not exist" containerID="053570d9ffeec7129249cb9c99bef660d0ca55a7bc728bde0d6b106dbfa59a89" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.642221 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"053570d9ffeec7129249cb9c99bef660d0ca55a7bc728bde0d6b106dbfa59a89"} err="failed to get container status \"053570d9ffeec7129249cb9c99bef660d0ca55a7bc728bde0d6b106dbfa59a89\": rpc error: code = NotFound desc = could not find container \"053570d9ffeec7129249cb9c99bef660d0ca55a7bc728bde0d6b106dbfa59a89\": 
container with ID starting with 053570d9ffeec7129249cb9c99bef660d0ca55a7bc728bde0d6b106dbfa59a89 not found: ID does not exist" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.642247 4752 scope.go:117] "RemoveContainer" containerID="2ae1e204533a8c3917baf71c728617b25db8d20622ed0bb1d704957a792e07b5" Sep 29 10:57:55 crc kubenswrapper[4752]: E0929 10:57:55.642722 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ae1e204533a8c3917baf71c728617b25db8d20622ed0bb1d704957a792e07b5\": container with ID starting with 2ae1e204533a8c3917baf71c728617b25db8d20622ed0bb1d704957a792e07b5 not found: ID does not exist" containerID="2ae1e204533a8c3917baf71c728617b25db8d20622ed0bb1d704957a792e07b5" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.642774 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ae1e204533a8c3917baf71c728617b25db8d20622ed0bb1d704957a792e07b5"} err="failed to get container status \"2ae1e204533a8c3917baf71c728617b25db8d20622ed0bb1d704957a792e07b5\": rpc error: code = NotFound desc = could not find container \"2ae1e204533a8c3917baf71c728617b25db8d20622ed0bb1d704957a792e07b5\": container with ID starting with 2ae1e204533a8c3917baf71c728617b25db8d20622ed0bb1d704957a792e07b5 not found: ID does not exist" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.642829 4752 scope.go:117] "RemoveContainer" containerID="751f408aa56a540d619d84417039fefd6f66961578040e4ad9d41d1d1c0d967f" Sep 29 10:57:55 crc kubenswrapper[4752]: E0929 10:57:55.643218 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"751f408aa56a540d619d84417039fefd6f66961578040e4ad9d41d1d1c0d967f\": container with ID starting with 751f408aa56a540d619d84417039fefd6f66961578040e4ad9d41d1d1c0d967f not found: ID does not exist" 
containerID="751f408aa56a540d619d84417039fefd6f66961578040e4ad9d41d1d1c0d967f" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.643247 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"751f408aa56a540d619d84417039fefd6f66961578040e4ad9d41d1d1c0d967f"} err="failed to get container status \"751f408aa56a540d619d84417039fefd6f66961578040e4ad9d41d1d1c0d967f\": rpc error: code = NotFound desc = could not find container \"751f408aa56a540d619d84417039fefd6f66961578040e4ad9d41d1d1c0d967f\": container with ID starting with 751f408aa56a540d619d84417039fefd6f66961578040e4ad9d41d1d1c0d967f not found: ID does not exist" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.710930 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73deb0e5-ae20-4412-b500-571db85cc292-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.898761 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t95zn"] Sep 29 10:57:55 crc kubenswrapper[4752]: I0929 10:57:55.901775 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t95zn"] Sep 29 10:57:56 crc kubenswrapper[4752]: I0929 10:57:56.039257 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73deb0e5-ae20-4412-b500-571db85cc292" path="/var/lib/kubelet/pods/73deb0e5-ae20-4412-b500-571db85cc292/volumes" Sep 29 10:58:05 crc kubenswrapper[4752]: I0929 10:58:05.579483 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pvg87"] Sep 29 10:58:05 crc kubenswrapper[4752]: E0929 10:58:05.580486 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73deb0e5-ae20-4412-b500-571db85cc292" containerName="registry-server" Sep 29 10:58:05 crc kubenswrapper[4752]: I0929 10:58:05.580502 4752 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="73deb0e5-ae20-4412-b500-571db85cc292" containerName="registry-server" Sep 29 10:58:05 crc kubenswrapper[4752]: E0929 10:58:05.580519 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73deb0e5-ae20-4412-b500-571db85cc292" containerName="extract-utilities" Sep 29 10:58:05 crc kubenswrapper[4752]: I0929 10:58:05.580526 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="73deb0e5-ae20-4412-b500-571db85cc292" containerName="extract-utilities" Sep 29 10:58:05 crc kubenswrapper[4752]: E0929 10:58:05.580543 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73deb0e5-ae20-4412-b500-571db85cc292" containerName="extract-content" Sep 29 10:58:05 crc kubenswrapper[4752]: I0929 10:58:05.580551 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="73deb0e5-ae20-4412-b500-571db85cc292" containerName="extract-content" Sep 29 10:58:05 crc kubenswrapper[4752]: I0929 10:58:05.580646 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="73deb0e5-ae20-4412-b500-571db85cc292" containerName="registry-server" Sep 29 10:58:05 crc kubenswrapper[4752]: I0929 10:58:05.581512 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pvg87" Sep 29 10:58:05 crc kubenswrapper[4752]: I0929 10:58:05.593932 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pvg87"] Sep 29 10:58:05 crc kubenswrapper[4752]: I0929 10:58:05.660712 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f921ac72-7292-4794-9593-dc9da850677c-utilities\") pod \"community-operators-pvg87\" (UID: \"f921ac72-7292-4794-9593-dc9da850677c\") " pod="openshift-marketplace/community-operators-pvg87" Sep 29 10:58:05 crc kubenswrapper[4752]: I0929 10:58:05.660769 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f921ac72-7292-4794-9593-dc9da850677c-catalog-content\") pod \"community-operators-pvg87\" (UID: \"f921ac72-7292-4794-9593-dc9da850677c\") " pod="openshift-marketplace/community-operators-pvg87" Sep 29 10:58:05 crc kubenswrapper[4752]: I0929 10:58:05.660886 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtn5b\" (UniqueName: \"kubernetes.io/projected/f921ac72-7292-4794-9593-dc9da850677c-kube-api-access-mtn5b\") pod \"community-operators-pvg87\" (UID: \"f921ac72-7292-4794-9593-dc9da850677c\") " pod="openshift-marketplace/community-operators-pvg87" Sep 29 10:58:05 crc kubenswrapper[4752]: I0929 10:58:05.761997 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f921ac72-7292-4794-9593-dc9da850677c-utilities\") pod \"community-operators-pvg87\" (UID: \"f921ac72-7292-4794-9593-dc9da850677c\") " pod="openshift-marketplace/community-operators-pvg87" Sep 29 10:58:05 crc kubenswrapper[4752]: I0929 10:58:05.762058 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f921ac72-7292-4794-9593-dc9da850677c-catalog-content\") pod \"community-operators-pvg87\" (UID: \"f921ac72-7292-4794-9593-dc9da850677c\") " pod="openshift-marketplace/community-operators-pvg87" Sep 29 10:58:05 crc kubenswrapper[4752]: I0929 10:58:05.762116 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtn5b\" (UniqueName: \"kubernetes.io/projected/f921ac72-7292-4794-9593-dc9da850677c-kube-api-access-mtn5b\") pod \"community-operators-pvg87\" (UID: \"f921ac72-7292-4794-9593-dc9da850677c\") " pod="openshift-marketplace/community-operators-pvg87" Sep 29 10:58:05 crc kubenswrapper[4752]: I0929 10:58:05.762526 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f921ac72-7292-4794-9593-dc9da850677c-utilities\") pod \"community-operators-pvg87\" (UID: \"f921ac72-7292-4794-9593-dc9da850677c\") " pod="openshift-marketplace/community-operators-pvg87" Sep 29 10:58:05 crc kubenswrapper[4752]: I0929 10:58:05.762630 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f921ac72-7292-4794-9593-dc9da850677c-catalog-content\") pod \"community-operators-pvg87\" (UID: \"f921ac72-7292-4794-9593-dc9da850677c\") " pod="openshift-marketplace/community-operators-pvg87" Sep 29 10:58:05 crc kubenswrapper[4752]: I0929 10:58:05.801109 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtn5b\" (UniqueName: \"kubernetes.io/projected/f921ac72-7292-4794-9593-dc9da850677c-kube-api-access-mtn5b\") pod \"community-operators-pvg87\" (UID: \"f921ac72-7292-4794-9593-dc9da850677c\") " pod="openshift-marketplace/community-operators-pvg87" Sep 29 10:58:05 crc kubenswrapper[4752]: I0929 10:58:05.901224 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pvg87" Sep 29 10:58:06 crc kubenswrapper[4752]: I0929 10:58:06.454485 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pvg87"] Sep 29 10:58:06 crc kubenswrapper[4752]: W0929 10:58:06.465437 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf921ac72_7292_4794_9593_dc9da850677c.slice/crio-284da388bf05c6d7c01a4f443e49628d434430b60ea492a3281ca2a862e47254 WatchSource:0}: Error finding container 284da388bf05c6d7c01a4f443e49628d434430b60ea492a3281ca2a862e47254: Status 404 returned error can't find the container with id 284da388bf05c6d7c01a4f443e49628d434430b60ea492a3281ca2a862e47254 Sep 29 10:58:06 crc kubenswrapper[4752]: I0929 10:58:06.650611 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvg87" event={"ID":"f921ac72-7292-4794-9593-dc9da850677c","Type":"ContainerStarted","Data":"284da388bf05c6d7c01a4f443e49628d434430b60ea492a3281ca2a862e47254"} Sep 29 10:58:07 crc kubenswrapper[4752]: I0929 10:58:07.659395 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvg87" event={"ID":"f921ac72-7292-4794-9593-dc9da850677c","Type":"ContainerDied","Data":"204c0a6b9aefb1ff28273b0378d7d2833995e411da7893b77f064b0cd41b6b28"} Sep 29 10:58:07 crc kubenswrapper[4752]: I0929 10:58:07.659277 4752 generic.go:334] "Generic (PLEG): container finished" podID="f921ac72-7292-4794-9593-dc9da850677c" containerID="204c0a6b9aefb1ff28273b0378d7d2833995e411da7893b77f064b0cd41b6b28" exitCode=0 Sep 29 10:58:09 crc kubenswrapper[4752]: I0929 10:58:09.093978 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7cb899cd5c-bldh9" Sep 29 10:58:09 crc kubenswrapper[4752]: I0929 10:58:09.181454 4752 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/certified-operators-6cfkv"] Sep 29 10:58:09 crc kubenswrapper[4752]: I0929 10:58:09.183082 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6cfkv" Sep 29 10:58:09 crc kubenswrapper[4752]: I0929 10:58:09.215953 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d625ec1-322c-4653-b716-7e48615f94d8-catalog-content\") pod \"certified-operators-6cfkv\" (UID: \"2d625ec1-322c-4653-b716-7e48615f94d8\") " pod="openshift-marketplace/certified-operators-6cfkv" Sep 29 10:58:09 crc kubenswrapper[4752]: I0929 10:58:09.216028 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tfpw\" (UniqueName: \"kubernetes.io/projected/2d625ec1-322c-4653-b716-7e48615f94d8-kube-api-access-9tfpw\") pod \"certified-operators-6cfkv\" (UID: \"2d625ec1-322c-4653-b716-7e48615f94d8\") " pod="openshift-marketplace/certified-operators-6cfkv" Sep 29 10:58:09 crc kubenswrapper[4752]: I0929 10:58:09.216064 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d625ec1-322c-4653-b716-7e48615f94d8-utilities\") pod \"certified-operators-6cfkv\" (UID: \"2d625ec1-322c-4653-b716-7e48615f94d8\") " pod="openshift-marketplace/certified-operators-6cfkv" Sep 29 10:58:09 crc kubenswrapper[4752]: I0929 10:58:09.232481 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6cfkv"] Sep 29 10:58:09 crc kubenswrapper[4752]: I0929 10:58:09.317917 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d625ec1-322c-4653-b716-7e48615f94d8-catalog-content\") pod \"certified-operators-6cfkv\" (UID: 
\"2d625ec1-322c-4653-b716-7e48615f94d8\") " pod="openshift-marketplace/certified-operators-6cfkv" Sep 29 10:58:09 crc kubenswrapper[4752]: I0929 10:58:09.318334 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tfpw\" (UniqueName: \"kubernetes.io/projected/2d625ec1-322c-4653-b716-7e48615f94d8-kube-api-access-9tfpw\") pod \"certified-operators-6cfkv\" (UID: \"2d625ec1-322c-4653-b716-7e48615f94d8\") " pod="openshift-marketplace/certified-operators-6cfkv" Sep 29 10:58:09 crc kubenswrapper[4752]: I0929 10:58:09.318472 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d625ec1-322c-4653-b716-7e48615f94d8-utilities\") pod \"certified-operators-6cfkv\" (UID: \"2d625ec1-322c-4653-b716-7e48615f94d8\") " pod="openshift-marketplace/certified-operators-6cfkv" Sep 29 10:58:09 crc kubenswrapper[4752]: I0929 10:58:09.318600 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d625ec1-322c-4653-b716-7e48615f94d8-catalog-content\") pod \"certified-operators-6cfkv\" (UID: \"2d625ec1-322c-4653-b716-7e48615f94d8\") " pod="openshift-marketplace/certified-operators-6cfkv" Sep 29 10:58:09 crc kubenswrapper[4752]: I0929 10:58:09.318858 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d625ec1-322c-4653-b716-7e48615f94d8-utilities\") pod \"certified-operators-6cfkv\" (UID: \"2d625ec1-322c-4653-b716-7e48615f94d8\") " pod="openshift-marketplace/certified-operators-6cfkv" Sep 29 10:58:09 crc kubenswrapper[4752]: I0929 10:58:09.342904 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tfpw\" (UniqueName: \"kubernetes.io/projected/2d625ec1-322c-4653-b716-7e48615f94d8-kube-api-access-9tfpw\") pod \"certified-operators-6cfkv\" (UID: 
\"2d625ec1-322c-4653-b716-7e48615f94d8\") " pod="openshift-marketplace/certified-operators-6cfkv" Sep 29 10:58:09 crc kubenswrapper[4752]: I0929 10:58:09.504765 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6cfkv" Sep 29 10:58:09 crc kubenswrapper[4752]: I0929 10:58:09.680215 4752 generic.go:334] "Generic (PLEG): container finished" podID="f921ac72-7292-4794-9593-dc9da850677c" containerID="0cf515fd623c814b0384f10dbaf813ad5ef9177f6ec61a172819a860d469b95e" exitCode=0 Sep 29 10:58:09 crc kubenswrapper[4752]: I0929 10:58:09.680742 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvg87" event={"ID":"f921ac72-7292-4794-9593-dc9da850677c","Type":"ContainerDied","Data":"0cf515fd623c814b0384f10dbaf813ad5ef9177f6ec61a172819a860d469b95e"} Sep 29 10:58:09 crc kubenswrapper[4752]: I0929 10:58:09.813237 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6cfkv"] Sep 29 10:58:10 crc kubenswrapper[4752]: I0929 10:58:10.687889 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvg87" event={"ID":"f921ac72-7292-4794-9593-dc9da850677c","Type":"ContainerStarted","Data":"ffe294286334b732e783ec7447e5f7fd283adcde34fe8630e7298d8dc45eb7c9"} Sep 29 10:58:10 crc kubenswrapper[4752]: I0929 10:58:10.689490 4752 generic.go:334] "Generic (PLEG): container finished" podID="2d625ec1-322c-4653-b716-7e48615f94d8" containerID="3c54e9f4c97bbb0b66db105d78e34d15f930269dd23c709a092f48b413b71585" exitCode=0 Sep 29 10:58:10 crc kubenswrapper[4752]: I0929 10:58:10.689523 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cfkv" event={"ID":"2d625ec1-322c-4653-b716-7e48615f94d8","Type":"ContainerDied","Data":"3c54e9f4c97bbb0b66db105d78e34d15f930269dd23c709a092f48b413b71585"} Sep 29 10:58:10 crc kubenswrapper[4752]: I0929 
10:58:10.689543 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cfkv" event={"ID":"2d625ec1-322c-4653-b716-7e48615f94d8","Type":"ContainerStarted","Data":"8cbe7f5d909c2ebebd338e537328b7499f903b9b5a786603693d2fa5703a3346"} Sep 29 10:58:10 crc kubenswrapper[4752]: I0929 10:58:10.706278 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pvg87" podStartSLOduration=2.982879106 podStartE2EDuration="5.706255763s" podCreationTimestamp="2025-09-29 10:58:05 +0000 UTC" firstStartedPulling="2025-09-29 10:58:07.662588601 +0000 UTC m=+828.451730268" lastFinishedPulling="2025-09-29 10:58:10.385965258 +0000 UTC m=+831.175106925" observedRunningTime="2025-09-29 10:58:10.703716346 +0000 UTC m=+831.492858033" watchObservedRunningTime="2025-09-29 10:58:10.706255763 +0000 UTC m=+831.495397430" Sep 29 10:58:11 crc kubenswrapper[4752]: I0929 10:58:11.700953 4752 generic.go:334] "Generic (PLEG): container finished" podID="2d625ec1-322c-4653-b716-7e48615f94d8" containerID="3cd7f19d21a3309610bd9438fd081904d096c391bbdb9e10445939071367010b" exitCode=0 Sep 29 10:58:11 crc kubenswrapper[4752]: I0929 10:58:11.701050 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cfkv" event={"ID":"2d625ec1-322c-4653-b716-7e48615f94d8","Type":"ContainerDied","Data":"3cd7f19d21a3309610bd9438fd081904d096c391bbdb9e10445939071367010b"} Sep 29 10:58:12 crc kubenswrapper[4752]: I0929 10:58:12.712051 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cfkv" event={"ID":"2d625ec1-322c-4653-b716-7e48615f94d8","Type":"ContainerStarted","Data":"bbb424215669dc52605c43445e2ef44e3bfcfe2285af49677a02720c616fd062"} Sep 29 10:58:12 crc kubenswrapper[4752]: I0929 10:58:12.732431 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6cfkv" 
podStartSLOduration=2.183440029 podStartE2EDuration="3.732404394s" podCreationTimestamp="2025-09-29 10:58:09 +0000 UTC" firstStartedPulling="2025-09-29 10:58:10.690744057 +0000 UTC m=+831.479885724" lastFinishedPulling="2025-09-29 10:58:12.239708422 +0000 UTC m=+833.028850089" observedRunningTime="2025-09-29 10:58:12.72956757 +0000 UTC m=+833.518709237" watchObservedRunningTime="2025-09-29 10:58:12.732404394 +0000 UTC m=+833.521546061" Sep 29 10:58:15 crc kubenswrapper[4752]: I0929 10:58:15.902536 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pvg87" Sep 29 10:58:15 crc kubenswrapper[4752]: I0929 10:58:15.903312 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pvg87" Sep 29 10:58:15 crc kubenswrapper[4752]: I0929 10:58:15.940965 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pvg87" Sep 29 10:58:16 crc kubenswrapper[4752]: I0929 10:58:16.817757 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pvg87" Sep 29 10:58:19 crc kubenswrapper[4752]: I0929 10:58:19.378517 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pvg87"] Sep 29 10:58:19 crc kubenswrapper[4752]: I0929 10:58:19.379311 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pvg87" podUID="f921ac72-7292-4794-9593-dc9da850677c" containerName="registry-server" containerID="cri-o://ffe294286334b732e783ec7447e5f7fd283adcde34fe8630e7298d8dc45eb7c9" gracePeriod=2 Sep 29 10:58:19 crc kubenswrapper[4752]: I0929 10:58:19.505611 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6cfkv" Sep 29 10:58:19 crc kubenswrapper[4752]: I0929 10:58:19.506161 4752 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6cfkv" Sep 29 10:58:19 crc kubenswrapper[4752]: I0929 10:58:19.553505 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6cfkv" Sep 29 10:58:19 crc kubenswrapper[4752]: I0929 10:58:19.801129 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6cfkv" Sep 29 10:58:20 crc kubenswrapper[4752]: E0929 10:58:20.045047 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf921ac72_7292_4794_9593_dc9da850677c.slice/crio-conmon-ffe294286334b732e783ec7447e5f7fd283adcde34fe8630e7298d8dc45eb7c9.scope\": RecentStats: unable to find data in memory cache]" Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.422488 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pvg87" Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.487008 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f921ac72-7292-4794-9593-dc9da850677c-catalog-content\") pod \"f921ac72-7292-4794-9593-dc9da850677c\" (UID: \"f921ac72-7292-4794-9593-dc9da850677c\") " Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.487062 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtn5b\" (UniqueName: \"kubernetes.io/projected/f921ac72-7292-4794-9593-dc9da850677c-kube-api-access-mtn5b\") pod \"f921ac72-7292-4794-9593-dc9da850677c\" (UID: \"f921ac72-7292-4794-9593-dc9da850677c\") " Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.487140 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f921ac72-7292-4794-9593-dc9da850677c-utilities\") pod \"f921ac72-7292-4794-9593-dc9da850677c\" (UID: \"f921ac72-7292-4794-9593-dc9da850677c\") " Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.488137 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f921ac72-7292-4794-9593-dc9da850677c-utilities" (OuterVolumeSpecName: "utilities") pod "f921ac72-7292-4794-9593-dc9da850677c" (UID: "f921ac72-7292-4794-9593-dc9da850677c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.493391 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f921ac72-7292-4794-9593-dc9da850677c-kube-api-access-mtn5b" (OuterVolumeSpecName: "kube-api-access-mtn5b") pod "f921ac72-7292-4794-9593-dc9da850677c" (UID: "f921ac72-7292-4794-9593-dc9da850677c"). InnerVolumeSpecName "kube-api-access-mtn5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.539519 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f921ac72-7292-4794-9593-dc9da850677c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f921ac72-7292-4794-9593-dc9da850677c" (UID: "f921ac72-7292-4794-9593-dc9da850677c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.589467 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f921ac72-7292-4794-9593-dc9da850677c-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.589513 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f921ac72-7292-4794-9593-dc9da850677c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.589570 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtn5b\" (UniqueName: \"kubernetes.io/projected/f921ac72-7292-4794-9593-dc9da850677c-kube-api-access-mtn5b\") on node \"crc\" DevicePath \"\"" Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.767163 4752 generic.go:334] "Generic (PLEG): container finished" podID="f921ac72-7292-4794-9593-dc9da850677c" containerID="ffe294286334b732e783ec7447e5f7fd283adcde34fe8630e7298d8dc45eb7c9" exitCode=0 Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.767505 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvg87" event={"ID":"f921ac72-7292-4794-9593-dc9da850677c","Type":"ContainerDied","Data":"ffe294286334b732e783ec7447e5f7fd283adcde34fe8630e7298d8dc45eb7c9"} Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.767554 4752 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-pvg87" event={"ID":"f921ac72-7292-4794-9593-dc9da850677c","Type":"ContainerDied","Data":"284da388bf05c6d7c01a4f443e49628d434430b60ea492a3281ca2a862e47254"} Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.767585 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pvg87" Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.767596 4752 scope.go:117] "RemoveContainer" containerID="ffe294286334b732e783ec7447e5f7fd283adcde34fe8630e7298d8dc45eb7c9" Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.784847 4752 scope.go:117] "RemoveContainer" containerID="0cf515fd623c814b0384f10dbaf813ad5ef9177f6ec61a172819a860d469b95e" Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.799291 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pvg87"] Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.800218 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pvg87"] Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.815283 4752 scope.go:117] "RemoveContainer" containerID="204c0a6b9aefb1ff28273b0378d7d2833995e411da7893b77f064b0cd41b6b28" Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.840008 4752 scope.go:117] "RemoveContainer" containerID="ffe294286334b732e783ec7447e5f7fd283adcde34fe8630e7298d8dc45eb7c9" Sep 29 10:58:20 crc kubenswrapper[4752]: E0929 10:58:20.840894 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffe294286334b732e783ec7447e5f7fd283adcde34fe8630e7298d8dc45eb7c9\": container with ID starting with ffe294286334b732e783ec7447e5f7fd283adcde34fe8630e7298d8dc45eb7c9 not found: ID does not exist" containerID="ffe294286334b732e783ec7447e5f7fd283adcde34fe8630e7298d8dc45eb7c9" Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 
10:58:20.840941 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffe294286334b732e783ec7447e5f7fd283adcde34fe8630e7298d8dc45eb7c9"} err="failed to get container status \"ffe294286334b732e783ec7447e5f7fd283adcde34fe8630e7298d8dc45eb7c9\": rpc error: code = NotFound desc = could not find container \"ffe294286334b732e783ec7447e5f7fd283adcde34fe8630e7298d8dc45eb7c9\": container with ID starting with ffe294286334b732e783ec7447e5f7fd283adcde34fe8630e7298d8dc45eb7c9 not found: ID does not exist" Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.840972 4752 scope.go:117] "RemoveContainer" containerID="0cf515fd623c814b0384f10dbaf813ad5ef9177f6ec61a172819a860d469b95e" Sep 29 10:58:20 crc kubenswrapper[4752]: E0929 10:58:20.841632 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cf515fd623c814b0384f10dbaf813ad5ef9177f6ec61a172819a860d469b95e\": container with ID starting with 0cf515fd623c814b0384f10dbaf813ad5ef9177f6ec61a172819a860d469b95e not found: ID does not exist" containerID="0cf515fd623c814b0384f10dbaf813ad5ef9177f6ec61a172819a860d469b95e" Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.841679 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf515fd623c814b0384f10dbaf813ad5ef9177f6ec61a172819a860d469b95e"} err="failed to get container status \"0cf515fd623c814b0384f10dbaf813ad5ef9177f6ec61a172819a860d469b95e\": rpc error: code = NotFound desc = could not find container \"0cf515fd623c814b0384f10dbaf813ad5ef9177f6ec61a172819a860d469b95e\": container with ID starting with 0cf515fd623c814b0384f10dbaf813ad5ef9177f6ec61a172819a860d469b95e not found: ID does not exist" Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.841711 4752 scope.go:117] "RemoveContainer" containerID="204c0a6b9aefb1ff28273b0378d7d2833995e411da7893b77f064b0cd41b6b28" Sep 29 10:58:20 crc 
kubenswrapper[4752]: E0929 10:58:20.842390 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"204c0a6b9aefb1ff28273b0378d7d2833995e411da7893b77f064b0cd41b6b28\": container with ID starting with 204c0a6b9aefb1ff28273b0378d7d2833995e411da7893b77f064b0cd41b6b28 not found: ID does not exist" containerID="204c0a6b9aefb1ff28273b0378d7d2833995e411da7893b77f064b0cd41b6b28" Sep 29 10:58:20 crc kubenswrapper[4752]: I0929 10:58:20.842436 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"204c0a6b9aefb1ff28273b0378d7d2833995e411da7893b77f064b0cd41b6b28"} err="failed to get container status \"204c0a6b9aefb1ff28273b0378d7d2833995e411da7893b77f064b0cd41b6b28\": rpc error: code = NotFound desc = could not find container \"204c0a6b9aefb1ff28273b0378d7d2833995e411da7893b77f064b0cd41b6b28\": container with ID starting with 204c0a6b9aefb1ff28273b0378d7d2833995e411da7893b77f064b0cd41b6b28 not found: ID does not exist" Sep 29 10:58:22 crc kubenswrapper[4752]: I0929 10:58:22.038914 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f921ac72-7292-4794-9593-dc9da850677c" path="/var/lib/kubelet/pods/f921ac72-7292-4794-9593-dc9da850677c/volumes" Sep 29 10:58:23 crc kubenswrapper[4752]: I0929 10:58:23.169174 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6cfkv"] Sep 29 10:58:23 crc kubenswrapper[4752]: I0929 10:58:23.169489 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6cfkv" podUID="2d625ec1-322c-4653-b716-7e48615f94d8" containerName="registry-server" containerID="cri-o://bbb424215669dc52605c43445e2ef44e3bfcfe2285af49677a02720c616fd062" gracePeriod=2 Sep 29 10:58:23 crc kubenswrapper[4752]: I0929 10:58:23.792265 4752 generic.go:334] "Generic (PLEG): container finished" podID="2d625ec1-322c-4653-b716-7e48615f94d8" 
containerID="bbb424215669dc52605c43445e2ef44e3bfcfe2285af49677a02720c616fd062" exitCode=0 Sep 29 10:58:23 crc kubenswrapper[4752]: I0929 10:58:23.792348 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cfkv" event={"ID":"2d625ec1-322c-4653-b716-7e48615f94d8","Type":"ContainerDied","Data":"bbb424215669dc52605c43445e2ef44e3bfcfe2285af49677a02720c616fd062"} Sep 29 10:58:24 crc kubenswrapper[4752]: I0929 10:58:24.234570 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6cfkv" Sep 29 10:58:24 crc kubenswrapper[4752]: I0929 10:58:24.343124 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tfpw\" (UniqueName: \"kubernetes.io/projected/2d625ec1-322c-4653-b716-7e48615f94d8-kube-api-access-9tfpw\") pod \"2d625ec1-322c-4653-b716-7e48615f94d8\" (UID: \"2d625ec1-322c-4653-b716-7e48615f94d8\") " Sep 29 10:58:24 crc kubenswrapper[4752]: I0929 10:58:24.343189 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d625ec1-322c-4653-b716-7e48615f94d8-utilities\") pod \"2d625ec1-322c-4653-b716-7e48615f94d8\" (UID: \"2d625ec1-322c-4653-b716-7e48615f94d8\") " Sep 29 10:58:24 crc kubenswrapper[4752]: I0929 10:58:24.343230 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d625ec1-322c-4653-b716-7e48615f94d8-catalog-content\") pod \"2d625ec1-322c-4653-b716-7e48615f94d8\" (UID: \"2d625ec1-322c-4653-b716-7e48615f94d8\") " Sep 29 10:58:24 crc kubenswrapper[4752]: I0929 10:58:24.344220 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d625ec1-322c-4653-b716-7e48615f94d8-utilities" (OuterVolumeSpecName: "utilities") pod "2d625ec1-322c-4653-b716-7e48615f94d8" (UID: 
"2d625ec1-322c-4653-b716-7e48615f94d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:58:24 crc kubenswrapper[4752]: I0929 10:58:24.350248 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d625ec1-322c-4653-b716-7e48615f94d8-kube-api-access-9tfpw" (OuterVolumeSpecName: "kube-api-access-9tfpw") pod "2d625ec1-322c-4653-b716-7e48615f94d8" (UID: "2d625ec1-322c-4653-b716-7e48615f94d8"). InnerVolumeSpecName "kube-api-access-9tfpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:58:24 crc kubenswrapper[4752]: I0929 10:58:24.387848 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d625ec1-322c-4653-b716-7e48615f94d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d625ec1-322c-4653-b716-7e48615f94d8" (UID: "2d625ec1-322c-4653-b716-7e48615f94d8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:58:24 crc kubenswrapper[4752]: I0929 10:58:24.445160 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tfpw\" (UniqueName: \"kubernetes.io/projected/2d625ec1-322c-4653-b716-7e48615f94d8-kube-api-access-9tfpw\") on node \"crc\" DevicePath \"\"" Sep 29 10:58:24 crc kubenswrapper[4752]: I0929 10:58:24.445211 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d625ec1-322c-4653-b716-7e48615f94d8-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:58:24 crc kubenswrapper[4752]: I0929 10:58:24.445224 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d625ec1-322c-4653-b716-7e48615f94d8-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:58:24 crc kubenswrapper[4752]: I0929 10:58:24.802245 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-6cfkv" event={"ID":"2d625ec1-322c-4653-b716-7e48615f94d8","Type":"ContainerDied","Data":"8cbe7f5d909c2ebebd338e537328b7499f903b9b5a786603693d2fa5703a3346"} Sep 29 10:58:24 crc kubenswrapper[4752]: I0929 10:58:24.802310 4752 scope.go:117] "RemoveContainer" containerID="bbb424215669dc52605c43445e2ef44e3bfcfe2285af49677a02720c616fd062" Sep 29 10:58:24 crc kubenswrapper[4752]: I0929 10:58:24.802314 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6cfkv" Sep 29 10:58:24 crc kubenswrapper[4752]: I0929 10:58:24.818434 4752 scope.go:117] "RemoveContainer" containerID="3cd7f19d21a3309610bd9438fd081904d096c391bbdb9e10445939071367010b" Sep 29 10:58:24 crc kubenswrapper[4752]: I0929 10:58:24.832856 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6cfkv"] Sep 29 10:58:24 crc kubenswrapper[4752]: I0929 10:58:24.836872 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6cfkv"] Sep 29 10:58:24 crc kubenswrapper[4752]: I0929 10:58:24.848623 4752 scope.go:117] "RemoveContainer" containerID="3c54e9f4c97bbb0b66db105d78e34d15f930269dd23c709a092f48b413b71585" Sep 29 10:58:26 crc kubenswrapper[4752]: I0929 10:58:26.038388 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d625ec1-322c-4653-b716-7e48615f94d8" path="/var/lib/kubelet/pods/2d625ec1-322c-4653-b716-7e48615f94d8/volumes" Sep 29 10:58:28 crc kubenswrapper[4752]: I0929 10:58:28.789267 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-74ffd6549c-bbnch" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.465242 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-tmdtx"] Sep 29 10:58:29 crc kubenswrapper[4752]: E0929 10:58:29.465530 4752 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f921ac72-7292-4794-9593-dc9da850677c" containerName="registry-server" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.465546 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f921ac72-7292-4794-9593-dc9da850677c" containerName="registry-server" Sep 29 10:58:29 crc kubenswrapper[4752]: E0929 10:58:29.465559 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f921ac72-7292-4794-9593-dc9da850677c" containerName="extract-utilities" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.465565 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f921ac72-7292-4794-9593-dc9da850677c" containerName="extract-utilities" Sep 29 10:58:29 crc kubenswrapper[4752]: E0929 10:58:29.465575 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f921ac72-7292-4794-9593-dc9da850677c" containerName="extract-content" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.465582 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f921ac72-7292-4794-9593-dc9da850677c" containerName="extract-content" Sep 29 10:58:29 crc kubenswrapper[4752]: E0929 10:58:29.465597 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d625ec1-322c-4653-b716-7e48615f94d8" containerName="extract-utilities" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.465602 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d625ec1-322c-4653-b716-7e48615f94d8" containerName="extract-utilities" Sep 29 10:58:29 crc kubenswrapper[4752]: E0929 10:58:29.465612 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d625ec1-322c-4653-b716-7e48615f94d8" containerName="registry-server" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.465617 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d625ec1-322c-4653-b716-7e48615f94d8" containerName="registry-server" Sep 29 10:58:29 crc kubenswrapper[4752]: E0929 10:58:29.465625 4752 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d625ec1-322c-4653-b716-7e48615f94d8" containerName="extract-content" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.465630 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d625ec1-322c-4653-b716-7e48615f94d8" containerName="extract-content" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.465737 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f921ac72-7292-4794-9593-dc9da850677c" containerName="registry-server" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.465747 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d625ec1-322c-4653-b716-7e48615f94d8" containerName="registry-server" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.466192 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-tmdtx" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.470461 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.483859 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-rzcng"] Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.491069 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.494087 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-tmdtx"] Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.505284 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.505525 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.520854 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1311e458-5fb3-4322-9c19-08bce8710d6e-metrics\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.521060 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgxl8\" (UniqueName: \"kubernetes.io/projected/1311e458-5fb3-4322-9c19-08bce8710d6e-kube-api-access-kgxl8\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.521101 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bafc1f1a-c41b-4f47-bf6c-134aa6d5f9f0-cert\") pod \"frr-k8s-webhook-server-5478bdb765-tmdtx\" (UID: \"bafc1f1a-c41b-4f47-bf6c-134aa6d5f9f0\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-tmdtx" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.521119 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/1311e458-5fb3-4322-9c19-08bce8710d6e-frr-conf\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.521138 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1311e458-5fb3-4322-9c19-08bce8710d6e-metrics-certs\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.521157 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1311e458-5fb3-4322-9c19-08bce8710d6e-frr-startup\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.521176 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1311e458-5fb3-4322-9c19-08bce8710d6e-frr-sockets\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.521205 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1311e458-5fb3-4322-9c19-08bce8710d6e-reloader\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.521252 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4zsd\" (UniqueName: \"kubernetes.io/projected/bafc1f1a-c41b-4f47-bf6c-134aa6d5f9f0-kube-api-access-j4zsd\") pod 
\"frr-k8s-webhook-server-5478bdb765-tmdtx\" (UID: \"bafc1f1a-c41b-4f47-bf6c-134aa6d5f9f0\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-tmdtx" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.576322 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-s4jmn"] Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.577832 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-s4jmn" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.583531 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.583827 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.585100 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.603239 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-fqwg9"] Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.604567 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-fqwg9" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.609456 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.613149 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-fqwg9"] Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.623829 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgxl8\" (UniqueName: \"kubernetes.io/projected/1311e458-5fb3-4322-9c19-08bce8710d6e-kube-api-access-kgxl8\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.623909 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/09cd799c-a613-4441-b407-b907862fec48-metallb-excludel2\") pod \"speaker-s4jmn\" (UID: \"09cd799c-a613-4441-b407-b907862fec48\") " pod="metallb-system/speaker-s4jmn" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.623941 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bafc1f1a-c41b-4f47-bf6c-134aa6d5f9f0-cert\") pod \"frr-k8s-webhook-server-5478bdb765-tmdtx\" (UID: \"bafc1f1a-c41b-4f47-bf6c-134aa6d5f9f0\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-tmdtx" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.623966 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1311e458-5fb3-4322-9c19-08bce8710d6e-frr-conf\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.623993 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1311e458-5fb3-4322-9c19-08bce8710d6e-metrics-certs\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.624031 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1311e458-5fb3-4322-9c19-08bce8710d6e-frr-startup\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.624055 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1311e458-5fb3-4322-9c19-08bce8710d6e-frr-sockets\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.624080 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/09cd799c-a613-4441-b407-b907862fec48-memberlist\") pod \"speaker-s4jmn\" (UID: \"09cd799c-a613-4441-b407-b907862fec48\") " pod="metallb-system/speaker-s4jmn" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.624101 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20827846-6dfa-4a33-b073-182cd993a60a-metrics-certs\") pod \"controller-5d688f5ffc-fqwg9\" (UID: \"20827846-6dfa-4a33-b073-182cd993a60a\") " pod="metallb-system/controller-5d688f5ffc-fqwg9" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.624141 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/1311e458-5fb3-4322-9c19-08bce8710d6e-reloader\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.624161 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09cd799c-a613-4441-b407-b907862fec48-metrics-certs\") pod \"speaker-s4jmn\" (UID: \"09cd799c-a613-4441-b407-b907862fec48\") " pod="metallb-system/speaker-s4jmn" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.624205 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qksg\" (UniqueName: \"kubernetes.io/projected/09cd799c-a613-4441-b407-b907862fec48-kube-api-access-9qksg\") pod \"speaker-s4jmn\" (UID: \"09cd799c-a613-4441-b407-b907862fec48\") " pod="metallb-system/speaker-s4jmn" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.624235 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbcsp\" (UniqueName: \"kubernetes.io/projected/20827846-6dfa-4a33-b073-182cd993a60a-kube-api-access-rbcsp\") pod \"controller-5d688f5ffc-fqwg9\" (UID: \"20827846-6dfa-4a33-b073-182cd993a60a\") " pod="metallb-system/controller-5d688f5ffc-fqwg9" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.624258 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20827846-6dfa-4a33-b073-182cd993a60a-cert\") pod \"controller-5d688f5ffc-fqwg9\" (UID: \"20827846-6dfa-4a33-b073-182cd993a60a\") " pod="metallb-system/controller-5d688f5ffc-fqwg9" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.624289 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4zsd\" (UniqueName: 
\"kubernetes.io/projected/bafc1f1a-c41b-4f47-bf6c-134aa6d5f9f0-kube-api-access-j4zsd\") pod \"frr-k8s-webhook-server-5478bdb765-tmdtx\" (UID: \"bafc1f1a-c41b-4f47-bf6c-134aa6d5f9f0\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-tmdtx" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.624316 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1311e458-5fb3-4322-9c19-08bce8710d6e-metrics\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.624858 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1311e458-5fb3-4322-9c19-08bce8710d6e-metrics\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.625238 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1311e458-5fb3-4322-9c19-08bce8710d6e-reloader\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.625268 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1311e458-5fb3-4322-9c19-08bce8710d6e-frr-conf\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.627357 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1311e458-5fb3-4322-9c19-08bce8710d6e-frr-startup\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc 
kubenswrapper[4752]: I0929 10:58:29.627646 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1311e458-5fb3-4322-9c19-08bce8710d6e-frr-sockets\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.637834 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1311e458-5fb3-4322-9c19-08bce8710d6e-metrics-certs\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.637848 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bafc1f1a-c41b-4f47-bf6c-134aa6d5f9f0-cert\") pod \"frr-k8s-webhook-server-5478bdb765-tmdtx\" (UID: \"bafc1f1a-c41b-4f47-bf6c-134aa6d5f9f0\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-tmdtx" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.648739 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4zsd\" (UniqueName: \"kubernetes.io/projected/bafc1f1a-c41b-4f47-bf6c-134aa6d5f9f0-kube-api-access-j4zsd\") pod \"frr-k8s-webhook-server-5478bdb765-tmdtx\" (UID: \"bafc1f1a-c41b-4f47-bf6c-134aa6d5f9f0\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-tmdtx" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.649781 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgxl8\" (UniqueName: \"kubernetes.io/projected/1311e458-5fb3-4322-9c19-08bce8710d6e-kube-api-access-kgxl8\") pod \"frr-k8s-rzcng\" (UID: \"1311e458-5fb3-4322-9c19-08bce8710d6e\") " pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.726237 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/09cd799c-a613-4441-b407-b907862fec48-memberlist\") pod \"speaker-s4jmn\" (UID: \"09cd799c-a613-4441-b407-b907862fec48\") " pod="metallb-system/speaker-s4jmn" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.726289 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20827846-6dfa-4a33-b073-182cd993a60a-metrics-certs\") pod \"controller-5d688f5ffc-fqwg9\" (UID: \"20827846-6dfa-4a33-b073-182cd993a60a\") " pod="metallb-system/controller-5d688f5ffc-fqwg9" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.726324 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09cd799c-a613-4441-b407-b907862fec48-metrics-certs\") pod \"speaker-s4jmn\" (UID: \"09cd799c-a613-4441-b407-b907862fec48\") " pod="metallb-system/speaker-s4jmn" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.726359 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qksg\" (UniqueName: \"kubernetes.io/projected/09cd799c-a613-4441-b407-b907862fec48-kube-api-access-9qksg\") pod \"speaker-s4jmn\" (UID: \"09cd799c-a613-4441-b407-b907862fec48\") " pod="metallb-system/speaker-s4jmn" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.726379 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbcsp\" (UniqueName: \"kubernetes.io/projected/20827846-6dfa-4a33-b073-182cd993a60a-kube-api-access-rbcsp\") pod \"controller-5d688f5ffc-fqwg9\" (UID: \"20827846-6dfa-4a33-b073-182cd993a60a\") " pod="metallb-system/controller-5d688f5ffc-fqwg9" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.726397 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20827846-6dfa-4a33-b073-182cd993a60a-cert\") pod 
\"controller-5d688f5ffc-fqwg9\" (UID: \"20827846-6dfa-4a33-b073-182cd993a60a\") " pod="metallb-system/controller-5d688f5ffc-fqwg9" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.726434 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/09cd799c-a613-4441-b407-b907862fec48-metallb-excludel2\") pod \"speaker-s4jmn\" (UID: \"09cd799c-a613-4441-b407-b907862fec48\") " pod="metallb-system/speaker-s4jmn" Sep 29 10:58:29 crc kubenswrapper[4752]: E0929 10:58:29.726450 4752 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 29 10:58:29 crc kubenswrapper[4752]: E0929 10:58:29.726535 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09cd799c-a613-4441-b407-b907862fec48-memberlist podName:09cd799c-a613-4441-b407-b907862fec48 nodeName:}" failed. No retries permitted until 2025-09-29 10:58:30.226510351 +0000 UTC m=+851.015652018 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/09cd799c-a613-4441-b407-b907862fec48-memberlist") pod "speaker-s4jmn" (UID: "09cd799c-a613-4441-b407-b907862fec48") : secret "metallb-memberlist" not found Sep 29 10:58:29 crc kubenswrapper[4752]: E0929 10:58:29.727076 4752 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Sep 29 10:58:29 crc kubenswrapper[4752]: E0929 10:58:29.727114 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20827846-6dfa-4a33-b073-182cd993a60a-metrics-certs podName:20827846-6dfa-4a33-b073-182cd993a60a nodeName:}" failed. No retries permitted until 2025-09-29 10:58:30.227105436 +0000 UTC m=+851.016247103 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20827846-6dfa-4a33-b073-182cd993a60a-metrics-certs") pod "controller-5d688f5ffc-fqwg9" (UID: "20827846-6dfa-4a33-b073-182cd993a60a") : secret "controller-certs-secret" not found Sep 29 10:58:29 crc kubenswrapper[4752]: E0929 10:58:29.727154 4752 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Sep 29 10:58:29 crc kubenswrapper[4752]: E0929 10:58:29.727173 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09cd799c-a613-4441-b407-b907862fec48-metrics-certs podName:09cd799c-a613-4441-b407-b907862fec48 nodeName:}" failed. No retries permitted until 2025-09-29 10:58:30.227167039 +0000 UTC m=+851.016308706 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09cd799c-a613-4441-b407-b907862fec48-metrics-certs") pod "speaker-s4jmn" (UID: "09cd799c-a613-4441-b407-b907862fec48") : secret "speaker-certs-secret" not found Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.727173 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/09cd799c-a613-4441-b407-b907862fec48-metallb-excludel2\") pod \"speaker-s4jmn\" (UID: \"09cd799c-a613-4441-b407-b907862fec48\") " pod="metallb-system/speaker-s4jmn" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.729129 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.741529 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20827846-6dfa-4a33-b073-182cd993a60a-cert\") pod \"controller-5d688f5ffc-fqwg9\" (UID: \"20827846-6dfa-4a33-b073-182cd993a60a\") " pod="metallb-system/controller-5d688f5ffc-fqwg9" Sep 29 10:58:29 crc 
kubenswrapper[4752]: I0929 10:58:29.752706 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qksg\" (UniqueName: \"kubernetes.io/projected/09cd799c-a613-4441-b407-b907862fec48-kube-api-access-9qksg\") pod \"speaker-s4jmn\" (UID: \"09cd799c-a613-4441-b407-b907862fec48\") " pod="metallb-system/speaker-s4jmn" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.752738 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbcsp\" (UniqueName: \"kubernetes.io/projected/20827846-6dfa-4a33-b073-182cd993a60a-kube-api-access-rbcsp\") pod \"controller-5d688f5ffc-fqwg9\" (UID: \"20827846-6dfa-4a33-b073-182cd993a60a\") " pod="metallb-system/controller-5d688f5ffc-fqwg9" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.837877 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-tmdtx" Sep 29 10:58:29 crc kubenswrapper[4752]: I0929 10:58:29.852874 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:30 crc kubenswrapper[4752]: I0929 10:58:30.197060 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-tmdtx"] Sep 29 10:58:30 crc kubenswrapper[4752]: I0929 10:58:30.233703 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09cd799c-a613-4441-b407-b907862fec48-metrics-certs\") pod \"speaker-s4jmn\" (UID: \"09cd799c-a613-4441-b407-b907862fec48\") " pod="metallb-system/speaker-s4jmn" Sep 29 10:58:30 crc kubenswrapper[4752]: I0929 10:58:30.233847 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/09cd799c-a613-4441-b407-b907862fec48-memberlist\") pod \"speaker-s4jmn\" (UID: \"09cd799c-a613-4441-b407-b907862fec48\") " pod="metallb-system/speaker-s4jmn" Sep 29 10:58:30 crc kubenswrapper[4752]: I0929 10:58:30.233904 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20827846-6dfa-4a33-b073-182cd993a60a-metrics-certs\") pod \"controller-5d688f5ffc-fqwg9\" (UID: \"20827846-6dfa-4a33-b073-182cd993a60a\") " pod="metallb-system/controller-5d688f5ffc-fqwg9" Sep 29 10:58:30 crc kubenswrapper[4752]: E0929 10:58:30.234040 4752 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 29 10:58:30 crc kubenswrapper[4752]: E0929 10:58:30.234170 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09cd799c-a613-4441-b407-b907862fec48-memberlist podName:09cd799c-a613-4441-b407-b907862fec48 nodeName:}" failed. No retries permitted until 2025-09-29 10:58:31.234134731 +0000 UTC m=+852.023276568 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/09cd799c-a613-4441-b407-b907862fec48-memberlist") pod "speaker-s4jmn" (UID: "09cd799c-a613-4441-b407-b907862fec48") : secret "metallb-memberlist" not found Sep 29 10:58:30 crc kubenswrapper[4752]: I0929 10:58:30.241167 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20827846-6dfa-4a33-b073-182cd993a60a-metrics-certs\") pod \"controller-5d688f5ffc-fqwg9\" (UID: \"20827846-6dfa-4a33-b073-182cd993a60a\") " pod="metallb-system/controller-5d688f5ffc-fqwg9" Sep 29 10:58:30 crc kubenswrapper[4752]: I0929 10:58:30.241515 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09cd799c-a613-4441-b407-b907862fec48-metrics-certs\") pod \"speaker-s4jmn\" (UID: \"09cd799c-a613-4441-b407-b907862fec48\") " pod="metallb-system/speaker-s4jmn" Sep 29 10:58:30 crc kubenswrapper[4752]: I0929 10:58:30.313377 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-fqwg9" Sep 29 10:58:30 crc kubenswrapper[4752]: I0929 10:58:30.714759 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-fqwg9"] Sep 29 10:58:30 crc kubenswrapper[4752]: I0929 10:58:30.842162 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-tmdtx" event={"ID":"bafc1f1a-c41b-4f47-bf6c-134aa6d5f9f0","Type":"ContainerStarted","Data":"dbd59249ad457ff3d1b32c8003cada52adadc0d90d4b40092c09e340a66d8bbe"} Sep 29 10:58:30 crc kubenswrapper[4752]: I0929 10:58:30.843662 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rzcng" event={"ID":"1311e458-5fb3-4322-9c19-08bce8710d6e","Type":"ContainerStarted","Data":"7613d8ba5d2d4433f05f9d73d2e4074863aa2d0f930f5cb2c72be5b50122879f"} Sep 29 10:58:30 crc kubenswrapper[4752]: I0929 10:58:30.844904 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-fqwg9" event={"ID":"20827846-6dfa-4a33-b073-182cd993a60a","Type":"ContainerStarted","Data":"ea9b7dca1b79d4a50a63aec7006cb003a205be00feb771734fefee30fb339333"} Sep 29 10:58:31 crc kubenswrapper[4752]: I0929 10:58:31.246406 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/09cd799c-a613-4441-b407-b907862fec48-memberlist\") pod \"speaker-s4jmn\" (UID: \"09cd799c-a613-4441-b407-b907862fec48\") " pod="metallb-system/speaker-s4jmn" Sep 29 10:58:31 crc kubenswrapper[4752]: I0929 10:58:31.256765 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/09cd799c-a613-4441-b407-b907862fec48-memberlist\") pod \"speaker-s4jmn\" (UID: \"09cd799c-a613-4441-b407-b907862fec48\") " pod="metallb-system/speaker-s4jmn" Sep 29 10:58:31 crc kubenswrapper[4752]: I0929 10:58:31.407033 4752 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="metallb-system/speaker-s4jmn" Sep 29 10:58:31 crc kubenswrapper[4752]: I0929 10:58:31.867679 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-fqwg9" event={"ID":"20827846-6dfa-4a33-b073-182cd993a60a","Type":"ContainerStarted","Data":"87a986c4f64e67cbd853a101cc333626ac2a4e9d9c1713fee4e36b080e140f7e"} Sep 29 10:58:31 crc kubenswrapper[4752]: I0929 10:58:31.867741 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-fqwg9" event={"ID":"20827846-6dfa-4a33-b073-182cd993a60a","Type":"ContainerStarted","Data":"08700d87bfede64288cf192c2df4a6aa7ba2aa058a778738d2841cc50286346b"} Sep 29 10:58:31 crc kubenswrapper[4752]: I0929 10:58:31.867975 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-fqwg9" Sep 29 10:58:31 crc kubenswrapper[4752]: I0929 10:58:31.882822 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-s4jmn" event={"ID":"09cd799c-a613-4441-b407-b907862fec48","Type":"ContainerStarted","Data":"8edfc38597be161be4727568b92b908b99a87faa689dd9da2964af26db15816c"} Sep 29 10:58:31 crc kubenswrapper[4752]: I0929 10:58:31.882880 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-s4jmn" event={"ID":"09cd799c-a613-4441-b407-b907862fec48","Type":"ContainerStarted","Data":"7657c0d29cb548a6f3b569aca50c9e1ba70fc92a95ae1f3a88230512d5437348"} Sep 29 10:58:31 crc kubenswrapper[4752]: I0929 10:58:31.904439 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-fqwg9" podStartSLOduration=2.904418015 podStartE2EDuration="2.904418015s" podCreationTimestamp="2025-09-29 10:58:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:58:31.9034722 +0000 UTC m=+852.692613867" 
watchObservedRunningTime="2025-09-29 10:58:31.904418015 +0000 UTC m=+852.693559682" Sep 29 10:58:32 crc kubenswrapper[4752]: I0929 10:58:32.898507 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-s4jmn" event={"ID":"09cd799c-a613-4441-b407-b907862fec48","Type":"ContainerStarted","Data":"77dbf65e0177cafc689be5c60c6bd26efc25c2963d3005f1730ad1ece4e54774"} Sep 29 10:58:32 crc kubenswrapper[4752]: I0929 10:58:32.899016 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-s4jmn" Sep 29 10:58:32 crc kubenswrapper[4752]: I0929 10:58:32.923222 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-s4jmn" podStartSLOduration=3.923194346 podStartE2EDuration="3.923194346s" podCreationTimestamp="2025-09-29 10:58:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:58:32.918443102 +0000 UTC m=+853.707584769" watchObservedRunningTime="2025-09-29 10:58:32.923194346 +0000 UTC m=+853.712336013" Sep 29 10:58:37 crc kubenswrapper[4752]: I0929 10:58:37.948891 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-tmdtx" event={"ID":"bafc1f1a-c41b-4f47-bf6c-134aa6d5f9f0","Type":"ContainerStarted","Data":"eb0c97d64032db3e4b2c0d61c98205da58f9dd85e54d126212548866638bc0e1"} Sep 29 10:58:37 crc kubenswrapper[4752]: I0929 10:58:37.949425 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-tmdtx" Sep 29 10:58:37 crc kubenswrapper[4752]: I0929 10:58:37.952658 4752 generic.go:334] "Generic (PLEG): container finished" podID="1311e458-5fb3-4322-9c19-08bce8710d6e" containerID="016eaf24ae56550c173df6cd9449016d58947c2bba15ea73fa9e30c81e8fdb6f" exitCode=0 Sep 29 10:58:37 crc kubenswrapper[4752]: I0929 10:58:37.952715 4752 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/frr-k8s-rzcng" event={"ID":"1311e458-5fb3-4322-9c19-08bce8710d6e","Type":"ContainerDied","Data":"016eaf24ae56550c173df6cd9449016d58947c2bba15ea73fa9e30c81e8fdb6f"} Sep 29 10:58:37 crc kubenswrapper[4752]: I0929 10:58:37.972056 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-tmdtx" podStartSLOduration=2.115953719 podStartE2EDuration="8.972030902s" podCreationTimestamp="2025-09-29 10:58:29 +0000 UTC" firstStartedPulling="2025-09-29 10:58:30.202913054 +0000 UTC m=+850.992054721" lastFinishedPulling="2025-09-29 10:58:37.058990237 +0000 UTC m=+857.848131904" observedRunningTime="2025-09-29 10:58:37.969155526 +0000 UTC m=+858.758297203" watchObservedRunningTime="2025-09-29 10:58:37.972030902 +0000 UTC m=+858.761172579" Sep 29 10:58:38 crc kubenswrapper[4752]: I0929 10:58:38.961416 4752 generic.go:334] "Generic (PLEG): container finished" podID="1311e458-5fb3-4322-9c19-08bce8710d6e" containerID="c5875fa32c2d037da1929391e129d33a970ddd550625b1ed4ccde1dfd5e7ba80" exitCode=0 Sep 29 10:58:38 crc kubenswrapper[4752]: I0929 10:58:38.961505 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rzcng" event={"ID":"1311e458-5fb3-4322-9c19-08bce8710d6e","Type":"ContainerDied","Data":"c5875fa32c2d037da1929391e129d33a970ddd550625b1ed4ccde1dfd5e7ba80"} Sep 29 10:58:39 crc kubenswrapper[4752]: I0929 10:58:39.971041 4752 generic.go:334] "Generic (PLEG): container finished" podID="1311e458-5fb3-4322-9c19-08bce8710d6e" containerID="700edb0a039bc0b8c3a771f728ee6b1de93e1dbacf32b1b06df6dd5103c1526c" exitCode=0 Sep 29 10:58:39 crc kubenswrapper[4752]: I0929 10:58:39.971107 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rzcng" event={"ID":"1311e458-5fb3-4322-9c19-08bce8710d6e","Type":"ContainerDied","Data":"700edb0a039bc0b8c3a771f728ee6b1de93e1dbacf32b1b06df6dd5103c1526c"} Sep 29 10:58:40 crc kubenswrapper[4752]: I0929 
10:58:40.317595 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-fqwg9" Sep 29 10:58:40 crc kubenswrapper[4752]: I0929 10:58:40.993690 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rzcng" event={"ID":"1311e458-5fb3-4322-9c19-08bce8710d6e","Type":"ContainerStarted","Data":"842b0edb09f874c395f3cc10ab1835d136063ea15ad3533c959d5b82eb5a1b4b"} Sep 29 10:58:40 crc kubenswrapper[4752]: I0929 10:58:40.994153 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rzcng" event={"ID":"1311e458-5fb3-4322-9c19-08bce8710d6e","Type":"ContainerStarted","Data":"d3a717e382395b9d52d4a01f7bc84d011ab6d1bb10ab2e29a8cc8bc956cd6d66"} Sep 29 10:58:40 crc kubenswrapper[4752]: I0929 10:58:40.994165 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rzcng" event={"ID":"1311e458-5fb3-4322-9c19-08bce8710d6e","Type":"ContainerStarted","Data":"064df524020a85d6f26ac486a4fdf5762d1553dab2841d3705a3e8fc2cbfbe44"} Sep 29 10:58:40 crc kubenswrapper[4752]: I0929 10:58:40.994330 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:40 crc kubenswrapper[4752]: I0929 10:58:40.994372 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rzcng" event={"ID":"1311e458-5fb3-4322-9c19-08bce8710d6e","Type":"ContainerStarted","Data":"0e3c64b6824f5aa1376913394894a9c6ee39d09184bb984109d3023c7b1cc8a0"} Sep 29 10:58:40 crc kubenswrapper[4752]: I0929 10:58:40.994390 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rzcng" event={"ID":"1311e458-5fb3-4322-9c19-08bce8710d6e","Type":"ContainerStarted","Data":"c46d3384a7a72a7cd5135b144b6098241e39434cabb3bc6a235d383f5e386267"} Sep 29 10:58:40 crc kubenswrapper[4752]: I0929 10:58:40.994399 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rzcng" 
event={"ID":"1311e458-5fb3-4322-9c19-08bce8710d6e","Type":"ContainerStarted","Data":"01f663a79a0b1cd23afd9d6f5948652ebd892e5579a57b2ffcc095f181f7265a"} Sep 29 10:58:41 crc kubenswrapper[4752]: I0929 10:58:41.025659 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-rzcng" podStartSLOduration=5.02892693 podStartE2EDuration="12.025637713s" podCreationTimestamp="2025-09-29 10:58:29 +0000 UTC" firstStartedPulling="2025-09-29 10:58:30.082631607 +0000 UTC m=+850.871773274" lastFinishedPulling="2025-09-29 10:58:37.07934239 +0000 UTC m=+857.868484057" observedRunningTime="2025-09-29 10:58:41.023945069 +0000 UTC m=+861.813086786" watchObservedRunningTime="2025-09-29 10:58:41.025637713 +0000 UTC m=+861.814779380" Sep 29 10:58:41 crc kubenswrapper[4752]: I0929 10:58:41.412308 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-s4jmn" Sep 29 10:58:42 crc kubenswrapper[4752]: I0929 10:58:42.838912 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh"] Sep 29 10:58:42 crc kubenswrapper[4752]: I0929 10:58:42.841710 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh" Sep 29 10:58:42 crc kubenswrapper[4752]: I0929 10:58:42.846728 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 29 10:58:42 crc kubenswrapper[4752]: I0929 10:58:42.855073 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pcqb\" (UniqueName: \"kubernetes.io/projected/67244ff2-a859-4d03-8ef2-8898fcfd01c0-kube-api-access-4pcqb\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh\" (UID: \"67244ff2-a859-4d03-8ef2-8898fcfd01c0\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh" Sep 29 10:58:42 crc kubenswrapper[4752]: I0929 10:58:42.855545 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67244ff2-a859-4d03-8ef2-8898fcfd01c0-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh\" (UID: \"67244ff2-a859-4d03-8ef2-8898fcfd01c0\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh" Sep 29 10:58:42 crc kubenswrapper[4752]: I0929 10:58:42.855768 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67244ff2-a859-4d03-8ef2-8898fcfd01c0-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh\" (UID: \"67244ff2-a859-4d03-8ef2-8898fcfd01c0\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh" Sep 29 10:58:42 crc kubenswrapper[4752]: I0929 10:58:42.860540 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh"] Sep 29 10:58:42 crc kubenswrapper[4752]: 
I0929 10:58:42.957286 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pcqb\" (UniqueName: \"kubernetes.io/projected/67244ff2-a859-4d03-8ef2-8898fcfd01c0-kube-api-access-4pcqb\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh\" (UID: \"67244ff2-a859-4d03-8ef2-8898fcfd01c0\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh" Sep 29 10:58:42 crc kubenswrapper[4752]: I0929 10:58:42.957418 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67244ff2-a859-4d03-8ef2-8898fcfd01c0-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh\" (UID: \"67244ff2-a859-4d03-8ef2-8898fcfd01c0\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh" Sep 29 10:58:42 crc kubenswrapper[4752]: I0929 10:58:42.957472 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67244ff2-a859-4d03-8ef2-8898fcfd01c0-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh\" (UID: \"67244ff2-a859-4d03-8ef2-8898fcfd01c0\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh" Sep 29 10:58:42 crc kubenswrapper[4752]: I0929 10:58:42.958054 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67244ff2-a859-4d03-8ef2-8898fcfd01c0-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh\" (UID: \"67244ff2-a859-4d03-8ef2-8898fcfd01c0\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh" Sep 29 10:58:42 crc kubenswrapper[4752]: I0929 10:58:42.958196 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/67244ff2-a859-4d03-8ef2-8898fcfd01c0-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh\" (UID: \"67244ff2-a859-4d03-8ef2-8898fcfd01c0\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh" Sep 29 10:58:42 crc kubenswrapper[4752]: I0929 10:58:42.977419 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pcqb\" (UniqueName: \"kubernetes.io/projected/67244ff2-a859-4d03-8ef2-8898fcfd01c0-kube-api-access-4pcqb\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh\" (UID: \"67244ff2-a859-4d03-8ef2-8898fcfd01c0\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh" Sep 29 10:58:43 crc kubenswrapper[4752]: I0929 10:58:43.168892 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh" Sep 29 10:58:43 crc kubenswrapper[4752]: I0929 10:58:43.585365 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh"] Sep 29 10:58:43 crc kubenswrapper[4752]: W0929 10:58:43.594192 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67244ff2_a859_4d03_8ef2_8898fcfd01c0.slice/crio-ddc62d8f54009c8e866fb4354a24a231237f6bd15b86d81f2585439e28cf06b1 WatchSource:0}: Error finding container ddc62d8f54009c8e866fb4354a24a231237f6bd15b86d81f2585439e28cf06b1: Status 404 returned error can't find the container with id ddc62d8f54009c8e866fb4354a24a231237f6bd15b86d81f2585439e28cf06b1 Sep 29 10:58:44 crc kubenswrapper[4752]: I0929 10:58:44.020544 4752 generic.go:334] "Generic (PLEG): container finished" podID="67244ff2-a859-4d03-8ef2-8898fcfd01c0" containerID="337f75e2e67906d22f279d382d6d9e2c5c591f793e5adcb00a9cccfb52d1fe94" 
exitCode=0 Sep 29 10:58:44 crc kubenswrapper[4752]: I0929 10:58:44.020606 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh" event={"ID":"67244ff2-a859-4d03-8ef2-8898fcfd01c0","Type":"ContainerDied","Data":"337f75e2e67906d22f279d382d6d9e2c5c591f793e5adcb00a9cccfb52d1fe94"} Sep 29 10:58:44 crc kubenswrapper[4752]: I0929 10:58:44.020643 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh" event={"ID":"67244ff2-a859-4d03-8ef2-8898fcfd01c0","Type":"ContainerStarted","Data":"ddc62d8f54009c8e866fb4354a24a231237f6bd15b86d81f2585439e28cf06b1"} Sep 29 10:58:44 crc kubenswrapper[4752]: I0929 10:58:44.854109 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:44 crc kubenswrapper[4752]: I0929 10:58:44.894263 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-rzcng" Sep 29 10:58:48 crc kubenswrapper[4752]: I0929 10:58:48.050759 4752 generic.go:334] "Generic (PLEG): container finished" podID="67244ff2-a859-4d03-8ef2-8898fcfd01c0" containerID="ed7ea0e918179a4dcb35606cf1a6a98234c10cf72a49d137744337513e16aa5d" exitCode=0 Sep 29 10:58:48 crc kubenswrapper[4752]: I0929 10:58:48.050859 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh" event={"ID":"67244ff2-a859-4d03-8ef2-8898fcfd01c0","Type":"ContainerDied","Data":"ed7ea0e918179a4dcb35606cf1a6a98234c10cf72a49d137744337513e16aa5d"} Sep 29 10:58:49 crc kubenswrapper[4752]: I0929 10:58:49.062039 4752 generic.go:334] "Generic (PLEG): container finished" podID="67244ff2-a859-4d03-8ef2-8898fcfd01c0" containerID="053d99e054e62e968330c3bb1dd190145e2f2b6d0ba37f3fc1ed03a89c123821" exitCode=0 Sep 29 10:58:49 crc kubenswrapper[4752]: I0929 
10:58:49.062142 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh" event={"ID":"67244ff2-a859-4d03-8ef2-8898fcfd01c0","Type":"ContainerDied","Data":"053d99e054e62e968330c3bb1dd190145e2f2b6d0ba37f3fc1ed03a89c123821"} Sep 29 10:58:49 crc kubenswrapper[4752]: I0929 10:58:49.846471 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-tmdtx" Sep 29 10:58:50 crc kubenswrapper[4752]: I0929 10:58:50.320480 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh" Sep 29 10:58:50 crc kubenswrapper[4752]: I0929 10:58:50.369703 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67244ff2-a859-4d03-8ef2-8898fcfd01c0-util\") pod \"67244ff2-a859-4d03-8ef2-8898fcfd01c0\" (UID: \"67244ff2-a859-4d03-8ef2-8898fcfd01c0\") " Sep 29 10:58:50 crc kubenswrapper[4752]: I0929 10:58:50.370021 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pcqb\" (UniqueName: \"kubernetes.io/projected/67244ff2-a859-4d03-8ef2-8898fcfd01c0-kube-api-access-4pcqb\") pod \"67244ff2-a859-4d03-8ef2-8898fcfd01c0\" (UID: \"67244ff2-a859-4d03-8ef2-8898fcfd01c0\") " Sep 29 10:58:50 crc kubenswrapper[4752]: I0929 10:58:50.370108 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67244ff2-a859-4d03-8ef2-8898fcfd01c0-bundle\") pod \"67244ff2-a859-4d03-8ef2-8898fcfd01c0\" (UID: \"67244ff2-a859-4d03-8ef2-8898fcfd01c0\") " Sep 29 10:58:50 crc kubenswrapper[4752]: I0929 10:58:50.371690 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67244ff2-a859-4d03-8ef2-8898fcfd01c0-bundle" 
(OuterVolumeSpecName: "bundle") pod "67244ff2-a859-4d03-8ef2-8898fcfd01c0" (UID: "67244ff2-a859-4d03-8ef2-8898fcfd01c0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:58:50 crc kubenswrapper[4752]: I0929 10:58:50.377544 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67244ff2-a859-4d03-8ef2-8898fcfd01c0-kube-api-access-4pcqb" (OuterVolumeSpecName: "kube-api-access-4pcqb") pod "67244ff2-a859-4d03-8ef2-8898fcfd01c0" (UID: "67244ff2-a859-4d03-8ef2-8898fcfd01c0"). InnerVolumeSpecName "kube-api-access-4pcqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:58:50 crc kubenswrapper[4752]: I0929 10:58:50.383303 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67244ff2-a859-4d03-8ef2-8898fcfd01c0-util" (OuterVolumeSpecName: "util") pod "67244ff2-a859-4d03-8ef2-8898fcfd01c0" (UID: "67244ff2-a859-4d03-8ef2-8898fcfd01c0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:58:50 crc kubenswrapper[4752]: I0929 10:58:50.472933 4752 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67244ff2-a859-4d03-8ef2-8898fcfd01c0-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:58:50 crc kubenswrapper[4752]: I0929 10:58:50.472983 4752 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67244ff2-a859-4d03-8ef2-8898fcfd01c0-util\") on node \"crc\" DevicePath \"\"" Sep 29 10:58:50 crc kubenswrapper[4752]: I0929 10:58:50.472997 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pcqb\" (UniqueName: \"kubernetes.io/projected/67244ff2-a859-4d03-8ef2-8898fcfd01c0-kube-api-access-4pcqb\") on node \"crc\" DevicePath \"\"" Sep 29 10:58:51 crc kubenswrapper[4752]: I0929 10:58:51.080743 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh" event={"ID":"67244ff2-a859-4d03-8ef2-8898fcfd01c0","Type":"ContainerDied","Data":"ddc62d8f54009c8e866fb4354a24a231237f6bd15b86d81f2585439e28cf06b1"} Sep 29 10:58:51 crc kubenswrapper[4752]: I0929 10:58:51.080824 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddc62d8f54009c8e866fb4354a24a231237f6bd15b86d81f2585439e28cf06b1" Sep 29 10:58:51 crc kubenswrapper[4752]: I0929 10:58:51.080886 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh" Sep 29 10:58:56 crc kubenswrapper[4752]: I0929 10:58:56.176089 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:58:56 crc kubenswrapper[4752]: I0929 10:58:56.176743 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:58:56 crc kubenswrapper[4752]: I0929 10:58:56.847049 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w769"] Sep 29 10:58:56 crc kubenswrapper[4752]: E0929 10:58:56.847422 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67244ff2-a859-4d03-8ef2-8898fcfd01c0" containerName="extract" Sep 29 10:58:56 crc kubenswrapper[4752]: I0929 10:58:56.847442 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="67244ff2-a859-4d03-8ef2-8898fcfd01c0" containerName="extract" Sep 29 10:58:56 crc kubenswrapper[4752]: E0929 10:58:56.847456 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67244ff2-a859-4d03-8ef2-8898fcfd01c0" containerName="pull" Sep 29 10:58:56 crc kubenswrapper[4752]: I0929 10:58:56.847464 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="67244ff2-a859-4d03-8ef2-8898fcfd01c0" containerName="pull" Sep 29 10:58:56 crc kubenswrapper[4752]: E0929 10:58:56.847496 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67244ff2-a859-4d03-8ef2-8898fcfd01c0" 
containerName="util" Sep 29 10:58:56 crc kubenswrapper[4752]: I0929 10:58:56.847508 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="67244ff2-a859-4d03-8ef2-8898fcfd01c0" containerName="util" Sep 29 10:58:56 crc kubenswrapper[4752]: I0929 10:58:56.847668 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="67244ff2-a859-4d03-8ef2-8898fcfd01c0" containerName="extract" Sep 29 10:58:56 crc kubenswrapper[4752]: I0929 10:58:56.848331 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w769" Sep 29 10:58:56 crc kubenswrapper[4752]: I0929 10:58:56.852298 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Sep 29 10:58:56 crc kubenswrapper[4752]: I0929 10:58:56.852637 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Sep 29 10:58:56 crc kubenswrapper[4752]: I0929 10:58:56.857768 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-425tw\" (UniqueName: \"kubernetes.io/projected/f70adf67-7162-4a14-a393-514f3732fa19-kube-api-access-425tw\") pod \"cert-manager-operator-controller-manager-57cd46d6d-2w769\" (UID: \"f70adf67-7162-4a14-a393-514f3732fa19\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w769" Sep 29 10:58:56 crc kubenswrapper[4752]: I0929 10:58:56.876771 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w769"] Sep 29 10:58:56 crc kubenswrapper[4752]: I0929 10:58:56.959655 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-425tw\" (UniqueName: \"kubernetes.io/projected/f70adf67-7162-4a14-a393-514f3732fa19-kube-api-access-425tw\") pod 
\"cert-manager-operator-controller-manager-57cd46d6d-2w769\" (UID: \"f70adf67-7162-4a14-a393-514f3732fa19\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w769" Sep 29 10:58:56 crc kubenswrapper[4752]: I0929 10:58:56.989198 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-425tw\" (UniqueName: \"kubernetes.io/projected/f70adf67-7162-4a14-a393-514f3732fa19-kube-api-access-425tw\") pod \"cert-manager-operator-controller-manager-57cd46d6d-2w769\" (UID: \"f70adf67-7162-4a14-a393-514f3732fa19\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w769" Sep 29 10:58:57 crc kubenswrapper[4752]: I0929 10:58:57.165584 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w769" Sep 29 10:58:57 crc kubenswrapper[4752]: I0929 10:58:57.400327 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w769"] Sep 29 10:58:57 crc kubenswrapper[4752]: W0929 10:58:57.409045 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf70adf67_7162_4a14_a393_514f3732fa19.slice/crio-cb65366ef1123822ecedb982ffb4e9521fd2134a199b2d1f760c560bd11ed4fe WatchSource:0}: Error finding container cb65366ef1123822ecedb982ffb4e9521fd2134a199b2d1f760c560bd11ed4fe: Status 404 returned error can't find the container with id cb65366ef1123822ecedb982ffb4e9521fd2134a199b2d1f760c560bd11ed4fe Sep 29 10:58:58 crc kubenswrapper[4752]: I0929 10:58:58.128164 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w769" event={"ID":"f70adf67-7162-4a14-a393-514f3732fa19","Type":"ContainerStarted","Data":"cb65366ef1123822ecedb982ffb4e9521fd2134a199b2d1f760c560bd11ed4fe"} Sep 29 10:58:59 
crc kubenswrapper[4752]: I0929 10:58:59.857786 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-rzcng" Sep 29 10:59:01 crc kubenswrapper[4752]: I0929 10:59:01.152911 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w769" event={"ID":"f70adf67-7162-4a14-a393-514f3732fa19","Type":"ContainerStarted","Data":"781e75e7bbcb12e4549cfa9a30951d3306912eff9c22bbbb052025b906117082"} Sep 29 10:59:01 crc kubenswrapper[4752]: I0929 10:59:01.170110 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w769" podStartSLOduration=2.3042413059999998 podStartE2EDuration="5.170079246s" podCreationTimestamp="2025-09-29 10:58:56 +0000 UTC" firstStartedPulling="2025-09-29 10:58:57.411024891 +0000 UTC m=+878.200166558" lastFinishedPulling="2025-09-29 10:59:00.276862831 +0000 UTC m=+881.066004498" observedRunningTime="2025-09-29 10:59:01.169565793 +0000 UTC m=+881.958707470" watchObservedRunningTime="2025-09-29 10:59:01.170079246 +0000 UTC m=+881.959220933" Sep 29 10:59:04 crc kubenswrapper[4752]: I0929 10:59:04.385388 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-bqbrt"] Sep 29 10:59:04 crc kubenswrapper[4752]: I0929 10:59:04.386537 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-bqbrt" Sep 29 10:59:04 crc kubenswrapper[4752]: I0929 10:59:04.388878 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Sep 29 10:59:04 crc kubenswrapper[4752]: I0929 10:59:04.389010 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Sep 29 10:59:04 crc kubenswrapper[4752]: I0929 10:59:04.397564 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-bqbrt"] Sep 29 10:59:04 crc kubenswrapper[4752]: I0929 10:59:04.473044 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xw28\" (UniqueName: \"kubernetes.io/projected/466779ed-d985-4eba-9277-1c5be5b56f9d-kube-api-access-5xw28\") pod \"cert-manager-webhook-d969966f-bqbrt\" (UID: \"466779ed-d985-4eba-9277-1c5be5b56f9d\") " pod="cert-manager/cert-manager-webhook-d969966f-bqbrt" Sep 29 10:59:04 crc kubenswrapper[4752]: I0929 10:59:04.473102 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/466779ed-d985-4eba-9277-1c5be5b56f9d-bound-sa-token\") pod \"cert-manager-webhook-d969966f-bqbrt\" (UID: \"466779ed-d985-4eba-9277-1c5be5b56f9d\") " pod="cert-manager/cert-manager-webhook-d969966f-bqbrt" Sep 29 10:59:04 crc kubenswrapper[4752]: I0929 10:59:04.574148 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xw28\" (UniqueName: \"kubernetes.io/projected/466779ed-d985-4eba-9277-1c5be5b56f9d-kube-api-access-5xw28\") pod \"cert-manager-webhook-d969966f-bqbrt\" (UID: \"466779ed-d985-4eba-9277-1c5be5b56f9d\") " pod="cert-manager/cert-manager-webhook-d969966f-bqbrt" Sep 29 10:59:04 crc kubenswrapper[4752]: I0929 10:59:04.574209 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/466779ed-d985-4eba-9277-1c5be5b56f9d-bound-sa-token\") pod \"cert-manager-webhook-d969966f-bqbrt\" (UID: \"466779ed-d985-4eba-9277-1c5be5b56f9d\") " pod="cert-manager/cert-manager-webhook-d969966f-bqbrt" Sep 29 10:59:04 crc kubenswrapper[4752]: I0929 10:59:04.592726 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/466779ed-d985-4eba-9277-1c5be5b56f9d-bound-sa-token\") pod \"cert-manager-webhook-d969966f-bqbrt\" (UID: \"466779ed-d985-4eba-9277-1c5be5b56f9d\") " pod="cert-manager/cert-manager-webhook-d969966f-bqbrt" Sep 29 10:59:04 crc kubenswrapper[4752]: I0929 10:59:04.592918 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xw28\" (UniqueName: \"kubernetes.io/projected/466779ed-d985-4eba-9277-1c5be5b56f9d-kube-api-access-5xw28\") pod \"cert-manager-webhook-d969966f-bqbrt\" (UID: \"466779ed-d985-4eba-9277-1c5be5b56f9d\") " pod="cert-manager/cert-manager-webhook-d969966f-bqbrt" Sep 29 10:59:04 crc kubenswrapper[4752]: I0929 10:59:04.709366 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-bqbrt" Sep 29 10:59:04 crc kubenswrapper[4752]: I0929 10:59:04.922725 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-bqbrt"] Sep 29 10:59:04 crc kubenswrapper[4752]: W0929 10:59:04.938582 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod466779ed_d985_4eba_9277_1c5be5b56f9d.slice/crio-6bd02067dfef554fd77c179b62a39413c924f096fe5e62c45ea017a4e1125a8c WatchSource:0}: Error finding container 6bd02067dfef554fd77c179b62a39413c924f096fe5e62c45ea017a4e1125a8c: Status 404 returned error can't find the container with id 6bd02067dfef554fd77c179b62a39413c924f096fe5e62c45ea017a4e1125a8c Sep 29 10:59:05 crc kubenswrapper[4752]: I0929 10:59:05.179357 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-bqbrt" event={"ID":"466779ed-d985-4eba-9277-1c5be5b56f9d","Type":"ContainerStarted","Data":"6bd02067dfef554fd77c179b62a39413c924f096fe5e62c45ea017a4e1125a8c"} Sep 29 10:59:05 crc kubenswrapper[4752]: I0929 10:59:05.757295 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-vqkv7"] Sep 29 10:59:05 crc kubenswrapper[4752]: I0929 10:59:05.758328 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vqkv7" Sep 29 10:59:05 crc kubenswrapper[4752]: I0929 10:59:05.767080 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-vqkv7"] Sep 29 10:59:05 crc kubenswrapper[4752]: I0929 10:59:05.894617 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/31dd96e4-2a05-4c30-b142-2cdcd6d82798-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-vqkv7\" (UID: \"31dd96e4-2a05-4c30-b142-2cdcd6d82798\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vqkv7" Sep 29 10:59:05 crc kubenswrapper[4752]: I0929 10:59:05.894685 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc5wm\" (UniqueName: \"kubernetes.io/projected/31dd96e4-2a05-4c30-b142-2cdcd6d82798-kube-api-access-fc5wm\") pod \"cert-manager-cainjector-7d9f95dbf-vqkv7\" (UID: \"31dd96e4-2a05-4c30-b142-2cdcd6d82798\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vqkv7" Sep 29 10:59:05 crc kubenswrapper[4752]: I0929 10:59:05.996064 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/31dd96e4-2a05-4c30-b142-2cdcd6d82798-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-vqkv7\" (UID: \"31dd96e4-2a05-4c30-b142-2cdcd6d82798\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vqkv7" Sep 29 10:59:05 crc kubenswrapper[4752]: I0929 10:59:05.996145 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc5wm\" (UniqueName: \"kubernetes.io/projected/31dd96e4-2a05-4c30-b142-2cdcd6d82798-kube-api-access-fc5wm\") pod \"cert-manager-cainjector-7d9f95dbf-vqkv7\" (UID: \"31dd96e4-2a05-4c30-b142-2cdcd6d82798\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vqkv7" Sep 29 10:59:06 
crc kubenswrapper[4752]: I0929 10:59:06.019707 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/31dd96e4-2a05-4c30-b142-2cdcd6d82798-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-vqkv7\" (UID: \"31dd96e4-2a05-4c30-b142-2cdcd6d82798\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vqkv7" Sep 29 10:59:06 crc kubenswrapper[4752]: I0929 10:59:06.019904 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc5wm\" (UniqueName: \"kubernetes.io/projected/31dd96e4-2a05-4c30-b142-2cdcd6d82798-kube-api-access-fc5wm\") pod \"cert-manager-cainjector-7d9f95dbf-vqkv7\" (UID: \"31dd96e4-2a05-4c30-b142-2cdcd6d82798\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vqkv7" Sep 29 10:59:06 crc kubenswrapper[4752]: I0929 10:59:06.078006 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vqkv7" Sep 29 10:59:06 crc kubenswrapper[4752]: I0929 10:59:06.498218 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-vqkv7"] Sep 29 10:59:07 crc kubenswrapper[4752]: I0929 10:59:07.197206 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vqkv7" event={"ID":"31dd96e4-2a05-4c30-b142-2cdcd6d82798","Type":"ContainerStarted","Data":"fc68e08fcf51266c32ff5d725b1bdc54731c0a510d116f684631486d52bee89c"} Sep 29 10:59:10 crc kubenswrapper[4752]: I0929 10:59:10.219709 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vqkv7" event={"ID":"31dd96e4-2a05-4c30-b142-2cdcd6d82798","Type":"ContainerStarted","Data":"c7dababff4a3447c2095de21715d8daea923d7a64fe883aa371e5e425a1db050"} Sep 29 10:59:10 crc kubenswrapper[4752]: I0929 10:59:10.221551 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-webhook-d969966f-bqbrt" event={"ID":"466779ed-d985-4eba-9277-1c5be5b56f9d","Type":"ContainerStarted","Data":"5b3eabd1c1e69edf7dcb7de3f28ac85dfb294b45ae98e0cd090c88a6124e50fe"} Sep 29 10:59:10 crc kubenswrapper[4752]: I0929 10:59:10.221627 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-bqbrt" Sep 29 10:59:10 crc kubenswrapper[4752]: I0929 10:59:10.248676 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vqkv7" podStartSLOduration=2.600283918 podStartE2EDuration="5.248637717s" podCreationTimestamp="2025-09-29 10:59:05 +0000 UTC" firstStartedPulling="2025-09-29 10:59:06.513711034 +0000 UTC m=+887.302852701" lastFinishedPulling="2025-09-29 10:59:09.162064833 +0000 UTC m=+889.951206500" observedRunningTime="2025-09-29 10:59:10.235960176 +0000 UTC m=+891.025101863" watchObservedRunningTime="2025-09-29 10:59:10.248637717 +0000 UTC m=+891.037779424" Sep 29 10:59:10 crc kubenswrapper[4752]: I0929 10:59:10.262038 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-bqbrt" podStartSLOduration=2.055877478 podStartE2EDuration="6.262019178s" podCreationTimestamp="2025-09-29 10:59:04 +0000 UTC" firstStartedPulling="2025-09-29 10:59:04.942222845 +0000 UTC m=+885.731364512" lastFinishedPulling="2025-09-29 10:59:09.148364545 +0000 UTC m=+889.937506212" observedRunningTime="2025-09-29 10:59:10.259281816 +0000 UTC m=+891.048423483" watchObservedRunningTime="2025-09-29 10:59:10.262019178 +0000 UTC m=+891.051160875" Sep 29 10:59:14 crc kubenswrapper[4752]: I0929 10:59:14.713241 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-bqbrt" Sep 29 10:59:23 crc kubenswrapper[4752]: I0929 10:59:23.140353 4752 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-7d4cc89fcb-pcv7b"] Sep 29 10:59:23 crc kubenswrapper[4752]: I0929 10:59:23.142011 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-pcv7b" Sep 29 10:59:23 crc kubenswrapper[4752]: I0929 10:59:23.153722 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-pcv7b"] Sep 29 10:59:23 crc kubenswrapper[4752]: I0929 10:59:23.154995 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77aec240-9f3c-4bbd-bb60-c1c1a81999ae-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-pcv7b\" (UID: \"77aec240-9f3c-4bbd-bb60-c1c1a81999ae\") " pod="cert-manager/cert-manager-7d4cc89fcb-pcv7b" Sep 29 10:59:23 crc kubenswrapper[4752]: I0929 10:59:23.155055 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb8m2\" (UniqueName: \"kubernetes.io/projected/77aec240-9f3c-4bbd-bb60-c1c1a81999ae-kube-api-access-jb8m2\") pod \"cert-manager-7d4cc89fcb-pcv7b\" (UID: \"77aec240-9f3c-4bbd-bb60-c1c1a81999ae\") " pod="cert-manager/cert-manager-7d4cc89fcb-pcv7b" Sep 29 10:59:23 crc kubenswrapper[4752]: I0929 10:59:23.256485 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77aec240-9f3c-4bbd-bb60-c1c1a81999ae-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-pcv7b\" (UID: \"77aec240-9f3c-4bbd-bb60-c1c1a81999ae\") " pod="cert-manager/cert-manager-7d4cc89fcb-pcv7b" Sep 29 10:59:23 crc kubenswrapper[4752]: I0929 10:59:23.256560 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb8m2\" (UniqueName: \"kubernetes.io/projected/77aec240-9f3c-4bbd-bb60-c1c1a81999ae-kube-api-access-jb8m2\") pod \"cert-manager-7d4cc89fcb-pcv7b\" (UID: \"77aec240-9f3c-4bbd-bb60-c1c1a81999ae\") 
" pod="cert-manager/cert-manager-7d4cc89fcb-pcv7b" Sep 29 10:59:23 crc kubenswrapper[4752]: I0929 10:59:23.284692 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77aec240-9f3c-4bbd-bb60-c1c1a81999ae-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-pcv7b\" (UID: \"77aec240-9f3c-4bbd-bb60-c1c1a81999ae\") " pod="cert-manager/cert-manager-7d4cc89fcb-pcv7b" Sep 29 10:59:23 crc kubenswrapper[4752]: I0929 10:59:23.285104 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb8m2\" (UniqueName: \"kubernetes.io/projected/77aec240-9f3c-4bbd-bb60-c1c1a81999ae-kube-api-access-jb8m2\") pod \"cert-manager-7d4cc89fcb-pcv7b\" (UID: \"77aec240-9f3c-4bbd-bb60-c1c1a81999ae\") " pod="cert-manager/cert-manager-7d4cc89fcb-pcv7b" Sep 29 10:59:23 crc kubenswrapper[4752]: I0929 10:59:23.461541 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-pcv7b" Sep 29 10:59:23 crc kubenswrapper[4752]: I0929 10:59:23.674662 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-pcv7b"] Sep 29 10:59:23 crc kubenswrapper[4752]: W0929 10:59:23.682661 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77aec240_9f3c_4bbd_bb60_c1c1a81999ae.slice/crio-984ce1cf5e7130a5658fbe6a91e0c8eec891146ef9415c61f2761a34b265b62d WatchSource:0}: Error finding container 984ce1cf5e7130a5658fbe6a91e0c8eec891146ef9415c61f2761a34b265b62d: Status 404 returned error can't find the container with id 984ce1cf5e7130a5658fbe6a91e0c8eec891146ef9415c61f2761a34b265b62d Sep 29 10:59:24 crc kubenswrapper[4752]: I0929 10:59:24.324212 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-pcv7b" 
event={"ID":"77aec240-9f3c-4bbd-bb60-c1c1a81999ae","Type":"ContainerStarted","Data":"107dbf1ff9395444f10360bcb4176560a05d5691228623b995a9c065784525db"} Sep 29 10:59:24 crc kubenswrapper[4752]: I0929 10:59:24.324722 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-pcv7b" event={"ID":"77aec240-9f3c-4bbd-bb60-c1c1a81999ae","Type":"ContainerStarted","Data":"984ce1cf5e7130a5658fbe6a91e0c8eec891146ef9415c61f2761a34b265b62d"} Sep 29 10:59:24 crc kubenswrapper[4752]: I0929 10:59:24.342983 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7d4cc89fcb-pcv7b" podStartSLOduration=1.3429595220000001 podStartE2EDuration="1.342959522s" podCreationTimestamp="2025-09-29 10:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:59:24.341952276 +0000 UTC m=+905.131093953" watchObservedRunningTime="2025-09-29 10:59:24.342959522 +0000 UTC m=+905.132101199" Sep 29 10:59:26 crc kubenswrapper[4752]: I0929 10:59:26.175736 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:59:26 crc kubenswrapper[4752]: I0929 10:59:26.175869 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:59:28 crc kubenswrapper[4752]: I0929 10:59:28.464217 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7m2nw"] Sep 29 10:59:28 crc 
kubenswrapper[4752]: I0929 10:59:28.466078 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7m2nw" Sep 29 10:59:28 crc kubenswrapper[4752]: I0929 10:59:28.470571 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Sep 29 10:59:28 crc kubenswrapper[4752]: I0929 10:59:28.470782 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Sep 29 10:59:28 crc kubenswrapper[4752]: I0929 10:59:28.484758 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7m2nw"] Sep 29 10:59:28 crc kubenswrapper[4752]: I0929 10:59:28.539864 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-742mt\" (UniqueName: \"kubernetes.io/projected/b3885b46-efb3-4122-aed2-c12627c56d80-kube-api-access-742mt\") pod \"openstack-operator-index-7m2nw\" (UID: \"b3885b46-efb3-4122-aed2-c12627c56d80\") " pod="openstack-operators/openstack-operator-index-7m2nw" Sep 29 10:59:28 crc kubenswrapper[4752]: I0929 10:59:28.641622 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-742mt\" (UniqueName: \"kubernetes.io/projected/b3885b46-efb3-4122-aed2-c12627c56d80-kube-api-access-742mt\") pod \"openstack-operator-index-7m2nw\" (UID: \"b3885b46-efb3-4122-aed2-c12627c56d80\") " pod="openstack-operators/openstack-operator-index-7m2nw" Sep 29 10:59:28 crc kubenswrapper[4752]: I0929 10:59:28.669095 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-742mt\" (UniqueName: \"kubernetes.io/projected/b3885b46-efb3-4122-aed2-c12627c56d80-kube-api-access-742mt\") pod \"openstack-operator-index-7m2nw\" (UID: \"b3885b46-efb3-4122-aed2-c12627c56d80\") " pod="openstack-operators/openstack-operator-index-7m2nw" Sep 29 10:59:28 crc 
kubenswrapper[4752]: I0929 10:59:28.791284 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7m2nw" Sep 29 10:59:28 crc kubenswrapper[4752]: I0929 10:59:28.987038 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7m2nw"] Sep 29 10:59:29 crc kubenswrapper[4752]: I0929 10:59:29.357762 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7m2nw" event={"ID":"b3885b46-efb3-4122-aed2-c12627c56d80","Type":"ContainerStarted","Data":"2baf708750cfdde239899610c908093e0fa375488c155ca8c17d7732c9874b42"} Sep 29 10:59:31 crc kubenswrapper[4752]: I0929 10:59:31.839424 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7m2nw"] Sep 29 10:59:32 crc kubenswrapper[4752]: I0929 10:59:32.445075 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rk7d4"] Sep 29 10:59:32 crc kubenswrapper[4752]: I0929 10:59:32.452100 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rk7d4" Sep 29 10:59:32 crc kubenswrapper[4752]: I0929 10:59:32.460065 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rk7d4"] Sep 29 10:59:32 crc kubenswrapper[4752]: I0929 10:59:32.616293 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tc8g\" (UniqueName: \"kubernetes.io/projected/16fdd20d-d43b-4b62-92fd-3760549d7b92-kube-api-access-7tc8g\") pod \"openstack-operator-index-rk7d4\" (UID: \"16fdd20d-d43b-4b62-92fd-3760549d7b92\") " pod="openstack-operators/openstack-operator-index-rk7d4" Sep 29 10:59:32 crc kubenswrapper[4752]: I0929 10:59:32.718182 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tc8g\" (UniqueName: \"kubernetes.io/projected/16fdd20d-d43b-4b62-92fd-3760549d7b92-kube-api-access-7tc8g\") pod \"openstack-operator-index-rk7d4\" (UID: \"16fdd20d-d43b-4b62-92fd-3760549d7b92\") " pod="openstack-operators/openstack-operator-index-rk7d4" Sep 29 10:59:32 crc kubenswrapper[4752]: I0929 10:59:32.744796 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tc8g\" (UniqueName: \"kubernetes.io/projected/16fdd20d-d43b-4b62-92fd-3760549d7b92-kube-api-access-7tc8g\") pod \"openstack-operator-index-rk7d4\" (UID: \"16fdd20d-d43b-4b62-92fd-3760549d7b92\") " pod="openstack-operators/openstack-operator-index-rk7d4" Sep 29 10:59:32 crc kubenswrapper[4752]: I0929 10:59:32.778404 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rk7d4" Sep 29 10:59:33 crc kubenswrapper[4752]: I0929 10:59:33.385365 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7m2nw" event={"ID":"b3885b46-efb3-4122-aed2-c12627c56d80","Type":"ContainerStarted","Data":"634d770d024cb80108951ae80dc60cfd4f1d910119cfcdbbec75c5f9fdfd1141"} Sep 29 10:59:33 crc kubenswrapper[4752]: I0929 10:59:33.385506 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-7m2nw" podUID="b3885b46-efb3-4122-aed2-c12627c56d80" containerName="registry-server" containerID="cri-o://634d770d024cb80108951ae80dc60cfd4f1d910119cfcdbbec75c5f9fdfd1141" gracePeriod=2 Sep 29 10:59:33 crc kubenswrapper[4752]: I0929 10:59:33.410202 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7m2nw" podStartSLOduration=1.341575554 podStartE2EDuration="5.410171747s" podCreationTimestamp="2025-09-29 10:59:28 +0000 UTC" firstStartedPulling="2025-09-29 10:59:28.999435294 +0000 UTC m=+909.788576951" lastFinishedPulling="2025-09-29 10:59:33.068031477 +0000 UTC m=+913.857173144" observedRunningTime="2025-09-29 10:59:33.404077137 +0000 UTC m=+914.193218814" watchObservedRunningTime="2025-09-29 10:59:33.410171747 +0000 UTC m=+914.199313404" Sep 29 10:59:33 crc kubenswrapper[4752]: I0929 10:59:33.437839 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rk7d4"] Sep 29 10:59:33 crc kubenswrapper[4752]: W0929 10:59:33.441511 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16fdd20d_d43b_4b62_92fd_3760549d7b92.slice/crio-38c544cc6a989391c1672d7e3f330bcd3fdaa2382cd4e53ca6ad991aa8669fde WatchSource:0}: Error finding container 
38c544cc6a989391c1672d7e3f330bcd3fdaa2382cd4e53ca6ad991aa8669fde: Status 404 returned error can't find the container with id 38c544cc6a989391c1672d7e3f330bcd3fdaa2382cd4e53ca6ad991aa8669fde Sep 29 10:59:33 crc kubenswrapper[4752]: I0929 10:59:33.853247 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7m2nw" Sep 29 10:59:34 crc kubenswrapper[4752]: I0929 10:59:34.035348 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-742mt\" (UniqueName: \"kubernetes.io/projected/b3885b46-efb3-4122-aed2-c12627c56d80-kube-api-access-742mt\") pod \"b3885b46-efb3-4122-aed2-c12627c56d80\" (UID: \"b3885b46-efb3-4122-aed2-c12627c56d80\") " Sep 29 10:59:34 crc kubenswrapper[4752]: I0929 10:59:34.049734 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3885b46-efb3-4122-aed2-c12627c56d80-kube-api-access-742mt" (OuterVolumeSpecName: "kube-api-access-742mt") pod "b3885b46-efb3-4122-aed2-c12627c56d80" (UID: "b3885b46-efb3-4122-aed2-c12627c56d80"). InnerVolumeSpecName "kube-api-access-742mt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:59:34 crc kubenswrapper[4752]: I0929 10:59:34.137392 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-742mt\" (UniqueName: \"kubernetes.io/projected/b3885b46-efb3-4122-aed2-c12627c56d80-kube-api-access-742mt\") on node \"crc\" DevicePath \"\"" Sep 29 10:59:34 crc kubenswrapper[4752]: I0929 10:59:34.394979 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rk7d4" event={"ID":"16fdd20d-d43b-4b62-92fd-3760549d7b92","Type":"ContainerStarted","Data":"b215cb062cf6ee5d2409c40d2be389a4f09442013aa9152f404254804789fddf"} Sep 29 10:59:34 crc kubenswrapper[4752]: I0929 10:59:34.395038 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rk7d4" event={"ID":"16fdd20d-d43b-4b62-92fd-3760549d7b92","Type":"ContainerStarted","Data":"38c544cc6a989391c1672d7e3f330bcd3fdaa2382cd4e53ca6ad991aa8669fde"} Sep 29 10:59:34 crc kubenswrapper[4752]: I0929 10:59:34.396980 4752 generic.go:334] "Generic (PLEG): container finished" podID="b3885b46-efb3-4122-aed2-c12627c56d80" containerID="634d770d024cb80108951ae80dc60cfd4f1d910119cfcdbbec75c5f9fdfd1141" exitCode=0 Sep 29 10:59:34 crc kubenswrapper[4752]: I0929 10:59:34.397013 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7m2nw" event={"ID":"b3885b46-efb3-4122-aed2-c12627c56d80","Type":"ContainerDied","Data":"634d770d024cb80108951ae80dc60cfd4f1d910119cfcdbbec75c5f9fdfd1141"} Sep 29 10:59:34 crc kubenswrapper[4752]: I0929 10:59:34.397034 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7m2nw" event={"ID":"b3885b46-efb3-4122-aed2-c12627c56d80","Type":"ContainerDied","Data":"2baf708750cfdde239899610c908093e0fa375488c155ca8c17d7732c9874b42"} Sep 29 10:59:34 crc kubenswrapper[4752]: I0929 10:59:34.397057 4752 scope.go:117] "RemoveContainer" 
containerID="634d770d024cb80108951ae80dc60cfd4f1d910119cfcdbbec75c5f9fdfd1141" Sep 29 10:59:34 crc kubenswrapper[4752]: I0929 10:59:34.397158 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7m2nw" Sep 29 10:59:34 crc kubenswrapper[4752]: I0929 10:59:34.412173 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rk7d4" podStartSLOduration=2.273129482 podStartE2EDuration="2.412148908s" podCreationTimestamp="2025-09-29 10:59:32 +0000 UTC" firstStartedPulling="2025-09-29 10:59:33.445410199 +0000 UTC m=+914.234551866" lastFinishedPulling="2025-09-29 10:59:33.584429625 +0000 UTC m=+914.373571292" observedRunningTime="2025-09-29 10:59:34.409708805 +0000 UTC m=+915.198850482" watchObservedRunningTime="2025-09-29 10:59:34.412148908 +0000 UTC m=+915.201290575" Sep 29 10:59:34 crc kubenswrapper[4752]: I0929 10:59:34.420774 4752 scope.go:117] "RemoveContainer" containerID="634d770d024cb80108951ae80dc60cfd4f1d910119cfcdbbec75c5f9fdfd1141" Sep 29 10:59:34 crc kubenswrapper[4752]: E0929 10:59:34.421219 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"634d770d024cb80108951ae80dc60cfd4f1d910119cfcdbbec75c5f9fdfd1141\": container with ID starting with 634d770d024cb80108951ae80dc60cfd4f1d910119cfcdbbec75c5f9fdfd1141 not found: ID does not exist" containerID="634d770d024cb80108951ae80dc60cfd4f1d910119cfcdbbec75c5f9fdfd1141" Sep 29 10:59:34 crc kubenswrapper[4752]: I0929 10:59:34.421262 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"634d770d024cb80108951ae80dc60cfd4f1d910119cfcdbbec75c5f9fdfd1141"} err="failed to get container status \"634d770d024cb80108951ae80dc60cfd4f1d910119cfcdbbec75c5f9fdfd1141\": rpc error: code = NotFound desc = could not find container 
\"634d770d024cb80108951ae80dc60cfd4f1d910119cfcdbbec75c5f9fdfd1141\": container with ID starting with 634d770d024cb80108951ae80dc60cfd4f1d910119cfcdbbec75c5f9fdfd1141 not found: ID does not exist" Sep 29 10:59:34 crc kubenswrapper[4752]: I0929 10:59:34.431326 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7m2nw"] Sep 29 10:59:34 crc kubenswrapper[4752]: I0929 10:59:34.434781 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-7m2nw"] Sep 29 10:59:36 crc kubenswrapper[4752]: I0929 10:59:36.040049 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3885b46-efb3-4122-aed2-c12627c56d80" path="/var/lib/kubelet/pods/b3885b46-efb3-4122-aed2-c12627c56d80/volumes" Sep 29 10:59:42 crc kubenswrapper[4752]: I0929 10:59:42.778601 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-rk7d4" Sep 29 10:59:42 crc kubenswrapper[4752]: I0929 10:59:42.783047 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-rk7d4" Sep 29 10:59:42 crc kubenswrapper[4752]: I0929 10:59:42.821525 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-rk7d4" Sep 29 10:59:43 crc kubenswrapper[4752]: I0929 10:59:43.486775 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-rk7d4" Sep 29 10:59:43 crc kubenswrapper[4752]: I0929 10:59:43.835015 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rk7d4"] Sep 29 10:59:44 crc kubenswrapper[4752]: I0929 10:59:44.440515 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-2s7f8"] Sep 29 10:59:44 crc kubenswrapper[4752]: E0929 10:59:44.440835 4752 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3885b46-efb3-4122-aed2-c12627c56d80" containerName="registry-server" Sep 29 10:59:44 crc kubenswrapper[4752]: I0929 10:59:44.440848 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3885b46-efb3-4122-aed2-c12627c56d80" containerName="registry-server" Sep 29 10:59:44 crc kubenswrapper[4752]: I0929 10:59:44.441001 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3885b46-efb3-4122-aed2-c12627c56d80" containerName="registry-server" Sep 29 10:59:44 crc kubenswrapper[4752]: I0929 10:59:44.441513 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2s7f8" Sep 29 10:59:44 crc kubenswrapper[4752]: I0929 10:59:44.443365 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-kjtrd" Sep 29 10:59:44 crc kubenswrapper[4752]: I0929 10:59:44.453959 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2s7f8"] Sep 29 10:59:44 crc kubenswrapper[4752]: I0929 10:59:44.594559 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shfsc\" (UniqueName: \"kubernetes.io/projected/b333e87b-b51e-4644-b1de-f08e213d84bd-kube-api-access-shfsc\") pod \"openstack-operator-index-2s7f8\" (UID: \"b333e87b-b51e-4644-b1de-f08e213d84bd\") " pod="openstack-operators/openstack-operator-index-2s7f8" Sep 29 10:59:44 crc kubenswrapper[4752]: I0929 10:59:44.696916 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shfsc\" (UniqueName: \"kubernetes.io/projected/b333e87b-b51e-4644-b1de-f08e213d84bd-kube-api-access-shfsc\") pod \"openstack-operator-index-2s7f8\" (UID: \"b333e87b-b51e-4644-b1de-f08e213d84bd\") " pod="openstack-operators/openstack-operator-index-2s7f8" Sep 29 10:59:44 crc kubenswrapper[4752]: I0929 
10:59:44.720741 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shfsc\" (UniqueName: \"kubernetes.io/projected/b333e87b-b51e-4644-b1de-f08e213d84bd-kube-api-access-shfsc\") pod \"openstack-operator-index-2s7f8\" (UID: \"b333e87b-b51e-4644-b1de-f08e213d84bd\") " pod="openstack-operators/openstack-operator-index-2s7f8" Sep 29 10:59:44 crc kubenswrapper[4752]: I0929 10:59:44.777890 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2s7f8" Sep 29 10:59:45 crc kubenswrapper[4752]: I0929 10:59:45.189766 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2s7f8"] Sep 29 10:59:45 crc kubenswrapper[4752]: W0929 10:59:45.195555 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb333e87b_b51e_4644_b1de_f08e213d84bd.slice/crio-d70147fbbeab7a4de752b903f7c06399659f484ce048796f3cf0fc6ba731f6e5 WatchSource:0}: Error finding container d70147fbbeab7a4de752b903f7c06399659f484ce048796f3cf0fc6ba731f6e5: Status 404 returned error can't find the container with id d70147fbbeab7a4de752b903f7c06399659f484ce048796f3cf0fc6ba731f6e5 Sep 29 10:59:45 crc kubenswrapper[4752]: I0929 10:59:45.489845 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2s7f8" event={"ID":"b333e87b-b51e-4644-b1de-f08e213d84bd","Type":"ContainerStarted","Data":"d70147fbbeab7a4de752b903f7c06399659f484ce048796f3cf0fc6ba731f6e5"} Sep 29 10:59:45 crc kubenswrapper[4752]: I0929 10:59:45.489956 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-rk7d4" podUID="16fdd20d-d43b-4b62-92fd-3760549d7b92" containerName="registry-server" containerID="cri-o://b215cb062cf6ee5d2409c40d2be389a4f09442013aa9152f404254804789fddf" gracePeriod=2 Sep 29 10:59:46 crc 
kubenswrapper[4752]: I0929 10:59:46.079625 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rk7d4" Sep 29 10:59:46 crc kubenswrapper[4752]: I0929 10:59:46.220706 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tc8g\" (UniqueName: \"kubernetes.io/projected/16fdd20d-d43b-4b62-92fd-3760549d7b92-kube-api-access-7tc8g\") pod \"16fdd20d-d43b-4b62-92fd-3760549d7b92\" (UID: \"16fdd20d-d43b-4b62-92fd-3760549d7b92\") " Sep 29 10:59:46 crc kubenswrapper[4752]: I0929 10:59:46.227887 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16fdd20d-d43b-4b62-92fd-3760549d7b92-kube-api-access-7tc8g" (OuterVolumeSpecName: "kube-api-access-7tc8g") pod "16fdd20d-d43b-4b62-92fd-3760549d7b92" (UID: "16fdd20d-d43b-4b62-92fd-3760549d7b92"). InnerVolumeSpecName "kube-api-access-7tc8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:59:46 crc kubenswrapper[4752]: I0929 10:59:46.322084 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tc8g\" (UniqueName: \"kubernetes.io/projected/16fdd20d-d43b-4b62-92fd-3760549d7b92-kube-api-access-7tc8g\") on node \"crc\" DevicePath \"\"" Sep 29 10:59:46 crc kubenswrapper[4752]: I0929 10:59:46.499835 4752 generic.go:334] "Generic (PLEG): container finished" podID="16fdd20d-d43b-4b62-92fd-3760549d7b92" containerID="b215cb062cf6ee5d2409c40d2be389a4f09442013aa9152f404254804789fddf" exitCode=0 Sep 29 10:59:46 crc kubenswrapper[4752]: I0929 10:59:46.499932 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rk7d4" Sep 29 10:59:46 crc kubenswrapper[4752]: I0929 10:59:46.499920 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rk7d4" event={"ID":"16fdd20d-d43b-4b62-92fd-3760549d7b92","Type":"ContainerDied","Data":"b215cb062cf6ee5d2409c40d2be389a4f09442013aa9152f404254804789fddf"} Sep 29 10:59:46 crc kubenswrapper[4752]: I0929 10:59:46.500293 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rk7d4" event={"ID":"16fdd20d-d43b-4b62-92fd-3760549d7b92","Type":"ContainerDied","Data":"38c544cc6a989391c1672d7e3f330bcd3fdaa2382cd4e53ca6ad991aa8669fde"} Sep 29 10:59:46 crc kubenswrapper[4752]: I0929 10:59:46.500329 4752 scope.go:117] "RemoveContainer" containerID="b215cb062cf6ee5d2409c40d2be389a4f09442013aa9152f404254804789fddf" Sep 29 10:59:46 crc kubenswrapper[4752]: I0929 10:59:46.502523 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2s7f8" event={"ID":"b333e87b-b51e-4644-b1de-f08e213d84bd","Type":"ContainerStarted","Data":"886a6a1c9321df69caa3b875e32838942bd573b40ef6f70e7a0af7ae80e039c6"} Sep 29 10:59:46 crc kubenswrapper[4752]: I0929 10:59:46.521162 4752 scope.go:117] "RemoveContainer" containerID="b215cb062cf6ee5d2409c40d2be389a4f09442013aa9152f404254804789fddf" Sep 29 10:59:46 crc kubenswrapper[4752]: E0929 10:59:46.521742 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b215cb062cf6ee5d2409c40d2be389a4f09442013aa9152f404254804789fddf\": container with ID starting with b215cb062cf6ee5d2409c40d2be389a4f09442013aa9152f404254804789fddf not found: ID does not exist" containerID="b215cb062cf6ee5d2409c40d2be389a4f09442013aa9152f404254804789fddf" Sep 29 10:59:46 crc kubenswrapper[4752]: I0929 10:59:46.521781 4752 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"b215cb062cf6ee5d2409c40d2be389a4f09442013aa9152f404254804789fddf"} err="failed to get container status \"b215cb062cf6ee5d2409c40d2be389a4f09442013aa9152f404254804789fddf\": rpc error: code = NotFound desc = could not find container \"b215cb062cf6ee5d2409c40d2be389a4f09442013aa9152f404254804789fddf\": container with ID starting with b215cb062cf6ee5d2409c40d2be389a4f09442013aa9152f404254804789fddf not found: ID does not exist" Sep 29 10:59:46 crc kubenswrapper[4752]: I0929 10:59:46.525688 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-2s7f8" podStartSLOduration=2.055208918 podStartE2EDuration="2.525673115s" podCreationTimestamp="2025-09-29 10:59:44 +0000 UTC" firstStartedPulling="2025-09-29 10:59:45.199392291 +0000 UTC m=+925.988533958" lastFinishedPulling="2025-09-29 10:59:45.669856488 +0000 UTC m=+926.458998155" observedRunningTime="2025-09-29 10:59:46.521439484 +0000 UTC m=+927.310581151" watchObservedRunningTime="2025-09-29 10:59:46.525673115 +0000 UTC m=+927.314814782" Sep 29 10:59:46 crc kubenswrapper[4752]: I0929 10:59:46.541408 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rk7d4"] Sep 29 10:59:46 crc kubenswrapper[4752]: I0929 10:59:46.547098 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-rk7d4"] Sep 29 10:59:48 crc kubenswrapper[4752]: I0929 10:59:48.040286 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16fdd20d-d43b-4b62-92fd-3760549d7b92" path="/var/lib/kubelet/pods/16fdd20d-d43b-4b62-92fd-3760549d7b92/volumes" Sep 29 10:59:54 crc kubenswrapper[4752]: I0929 10:59:54.778169 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-2s7f8" Sep 29 10:59:54 crc kubenswrapper[4752]: I0929 10:59:54.779000 4752 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-2s7f8" Sep 29 10:59:54 crc kubenswrapper[4752]: I0929 10:59:54.814973 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-2s7f8" Sep 29 10:59:55 crc kubenswrapper[4752]: I0929 10:59:55.594602 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-2s7f8" Sep 29 10:59:56 crc kubenswrapper[4752]: I0929 10:59:56.176203 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:59:56 crc kubenswrapper[4752]: I0929 10:59:56.176314 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:59:56 crc kubenswrapper[4752]: I0929 10:59:56.176393 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 10:59:56 crc kubenswrapper[4752]: I0929 10:59:56.177274 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dbba8a90f680e465e868c9761ab597851b2db8c336cda0417acd2b4d326ea54a"} pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:59:56 crc kubenswrapper[4752]: I0929 10:59:56.177348 4752 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" containerID="cri-o://dbba8a90f680e465e868c9761ab597851b2db8c336cda0417acd2b4d326ea54a" gracePeriod=600 Sep 29 10:59:56 crc kubenswrapper[4752]: I0929 10:59:56.575452 4752 generic.go:334] "Generic (PLEG): container finished" podID="5863c243-797d-462a-b11f-71aaf005f8d1" containerID="dbba8a90f680e465e868c9761ab597851b2db8c336cda0417acd2b4d326ea54a" exitCode=0 Sep 29 10:59:56 crc kubenswrapper[4752]: I0929 10:59:56.575647 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerDied","Data":"dbba8a90f680e465e868c9761ab597851b2db8c336cda0417acd2b4d326ea54a"} Sep 29 10:59:56 crc kubenswrapper[4752]: I0929 10:59:56.576602 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerStarted","Data":"48a0da04429cf7fcc316318f0d1c0bddde646fbce423db761e54fa0241cf9fda"} Sep 29 10:59:56 crc kubenswrapper[4752]: I0929 10:59:56.576631 4752 scope.go:117] "RemoveContainer" containerID="c407b08be26fe95221bcb36f9b8690f867d6ce7b5902b3cd248dbfd3fb7865c7" Sep 29 10:59:58 crc kubenswrapper[4752]: I0929 10:59:58.496675 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn"] Sep 29 10:59:58 crc kubenswrapper[4752]: E0929 10:59:58.497507 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fdd20d-d43b-4b62-92fd-3760549d7b92" containerName="registry-server" Sep 29 10:59:58 crc kubenswrapper[4752]: I0929 10:59:58.497527 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fdd20d-d43b-4b62-92fd-3760549d7b92" containerName="registry-server" Sep 29 10:59:58 
crc kubenswrapper[4752]: I0929 10:59:58.497663 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="16fdd20d-d43b-4b62-92fd-3760549d7b92" containerName="registry-server" Sep 29 10:59:58 crc kubenswrapper[4752]: I0929 10:59:58.498729 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn" Sep 29 10:59:58 crc kubenswrapper[4752]: I0929 10:59:58.502680 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-x25gs" Sep 29 10:59:58 crc kubenswrapper[4752]: I0929 10:59:58.514553 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn"] Sep 29 10:59:58 crc kubenswrapper[4752]: I0929 10:59:58.610571 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f94fa837-71f7-4540-b897-430e3f72928f-util\") pod \"24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn\" (UID: \"f94fa837-71f7-4540-b897-430e3f72928f\") " pod="openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn" Sep 29 10:59:58 crc kubenswrapper[4752]: I0929 10:59:58.610790 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr9j6\" (UniqueName: \"kubernetes.io/projected/f94fa837-71f7-4540-b897-430e3f72928f-kube-api-access-pr9j6\") pod \"24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn\" (UID: \"f94fa837-71f7-4540-b897-430e3f72928f\") " pod="openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn" Sep 29 10:59:58 crc kubenswrapper[4752]: I0929 10:59:58.610856 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/f94fa837-71f7-4540-b897-430e3f72928f-bundle\") pod \"24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn\" (UID: \"f94fa837-71f7-4540-b897-430e3f72928f\") " pod="openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn" Sep 29 10:59:58 crc kubenswrapper[4752]: I0929 10:59:58.711993 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr9j6\" (UniqueName: \"kubernetes.io/projected/f94fa837-71f7-4540-b897-430e3f72928f-kube-api-access-pr9j6\") pod \"24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn\" (UID: \"f94fa837-71f7-4540-b897-430e3f72928f\") " pod="openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn" Sep 29 10:59:58 crc kubenswrapper[4752]: I0929 10:59:58.712367 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f94fa837-71f7-4540-b897-430e3f72928f-bundle\") pod \"24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn\" (UID: \"f94fa837-71f7-4540-b897-430e3f72928f\") " pod="openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn" Sep 29 10:59:58 crc kubenswrapper[4752]: I0929 10:59:58.712399 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f94fa837-71f7-4540-b897-430e3f72928f-util\") pod \"24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn\" (UID: \"f94fa837-71f7-4540-b897-430e3f72928f\") " pod="openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn" Sep 29 10:59:58 crc kubenswrapper[4752]: I0929 10:59:58.713151 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f94fa837-71f7-4540-b897-430e3f72928f-util\") pod \"24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn\" (UID: 
\"f94fa837-71f7-4540-b897-430e3f72928f\") " pod="openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn" Sep 29 10:59:58 crc kubenswrapper[4752]: I0929 10:59:58.713241 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f94fa837-71f7-4540-b897-430e3f72928f-bundle\") pod \"24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn\" (UID: \"f94fa837-71f7-4540-b897-430e3f72928f\") " pod="openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn" Sep 29 10:59:58 crc kubenswrapper[4752]: I0929 10:59:58.734278 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr9j6\" (UniqueName: \"kubernetes.io/projected/f94fa837-71f7-4540-b897-430e3f72928f-kube-api-access-pr9j6\") pod \"24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn\" (UID: \"f94fa837-71f7-4540-b897-430e3f72928f\") " pod="openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn" Sep 29 10:59:58 crc kubenswrapper[4752]: I0929 10:59:58.817898 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn" Sep 29 10:59:59 crc kubenswrapper[4752]: I0929 10:59:59.246760 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn"] Sep 29 10:59:59 crc kubenswrapper[4752]: I0929 10:59:59.604915 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn" event={"ID":"f94fa837-71f7-4540-b897-430e3f72928f","Type":"ContainerStarted","Data":"3829bc31569a758d19f503f2fa35c4a0c2e1199c8f80642da6221926ff8a47cc"} Sep 29 10:59:59 crc kubenswrapper[4752]: I0929 10:59:59.604987 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn" event={"ID":"f94fa837-71f7-4540-b897-430e3f72928f","Type":"ContainerStarted","Data":"31bc7abc54dd66c31fc97bf530de78ed2d9e0381eb83c8df5f72060f82e9aff4"} Sep 29 11:00:00 crc kubenswrapper[4752]: I0929 11:00:00.151766 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319060-9rrgq"] Sep 29 11:00:00 crc kubenswrapper[4752]: I0929 11:00:00.153669 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319060-9rrgq" Sep 29 11:00:00 crc kubenswrapper[4752]: I0929 11:00:00.156148 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 29 11:00:00 crc kubenswrapper[4752]: I0929 11:00:00.160115 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 29 11:00:00 crc kubenswrapper[4752]: I0929 11:00:00.165207 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319060-9rrgq"] Sep 29 11:00:00 crc kubenswrapper[4752]: I0929 11:00:00.241538 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da0d491d-1eb4-40b0-89d7-3697f8b60002-config-volume\") pod \"collect-profiles-29319060-9rrgq\" (UID: \"da0d491d-1eb4-40b0-89d7-3697f8b60002\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319060-9rrgq" Sep 29 11:00:00 crc kubenswrapper[4752]: I0929 11:00:00.241789 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm9zs\" (UniqueName: \"kubernetes.io/projected/da0d491d-1eb4-40b0-89d7-3697f8b60002-kube-api-access-tm9zs\") pod \"collect-profiles-29319060-9rrgq\" (UID: \"da0d491d-1eb4-40b0-89d7-3697f8b60002\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319060-9rrgq" Sep 29 11:00:00 crc kubenswrapper[4752]: I0929 11:00:00.241908 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da0d491d-1eb4-40b0-89d7-3697f8b60002-secret-volume\") pod \"collect-profiles-29319060-9rrgq\" (UID: \"da0d491d-1eb4-40b0-89d7-3697f8b60002\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29319060-9rrgq" Sep 29 11:00:00 crc kubenswrapper[4752]: I0929 11:00:00.344479 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da0d491d-1eb4-40b0-89d7-3697f8b60002-config-volume\") pod \"collect-profiles-29319060-9rrgq\" (UID: \"da0d491d-1eb4-40b0-89d7-3697f8b60002\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319060-9rrgq" Sep 29 11:00:00 crc kubenswrapper[4752]: I0929 11:00:00.344573 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm9zs\" (UniqueName: \"kubernetes.io/projected/da0d491d-1eb4-40b0-89d7-3697f8b60002-kube-api-access-tm9zs\") pod \"collect-profiles-29319060-9rrgq\" (UID: \"da0d491d-1eb4-40b0-89d7-3697f8b60002\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319060-9rrgq" Sep 29 11:00:00 crc kubenswrapper[4752]: I0929 11:00:00.344615 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da0d491d-1eb4-40b0-89d7-3697f8b60002-secret-volume\") pod \"collect-profiles-29319060-9rrgq\" (UID: \"da0d491d-1eb4-40b0-89d7-3697f8b60002\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319060-9rrgq" Sep 29 11:00:00 crc kubenswrapper[4752]: I0929 11:00:00.345678 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da0d491d-1eb4-40b0-89d7-3697f8b60002-config-volume\") pod \"collect-profiles-29319060-9rrgq\" (UID: \"da0d491d-1eb4-40b0-89d7-3697f8b60002\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319060-9rrgq" Sep 29 11:00:00 crc kubenswrapper[4752]: I0929 11:00:00.352097 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/da0d491d-1eb4-40b0-89d7-3697f8b60002-secret-volume\") pod \"collect-profiles-29319060-9rrgq\" (UID: \"da0d491d-1eb4-40b0-89d7-3697f8b60002\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319060-9rrgq" Sep 29 11:00:00 crc kubenswrapper[4752]: I0929 11:00:00.364545 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm9zs\" (UniqueName: \"kubernetes.io/projected/da0d491d-1eb4-40b0-89d7-3697f8b60002-kube-api-access-tm9zs\") pod \"collect-profiles-29319060-9rrgq\" (UID: \"da0d491d-1eb4-40b0-89d7-3697f8b60002\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319060-9rrgq" Sep 29 11:00:00 crc kubenswrapper[4752]: I0929 11:00:00.474603 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319060-9rrgq" Sep 29 11:00:00 crc kubenswrapper[4752]: I0929 11:00:00.618375 4752 generic.go:334] "Generic (PLEG): container finished" podID="f94fa837-71f7-4540-b897-430e3f72928f" containerID="3829bc31569a758d19f503f2fa35c4a0c2e1199c8f80642da6221926ff8a47cc" exitCode=0 Sep 29 11:00:00 crc kubenswrapper[4752]: I0929 11:00:00.618449 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn" event={"ID":"f94fa837-71f7-4540-b897-430e3f72928f","Type":"ContainerDied","Data":"3829bc31569a758d19f503f2fa35c4a0c2e1199c8f80642da6221926ff8a47cc"} Sep 29 11:00:00 crc kubenswrapper[4752]: I0929 11:00:00.926723 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319060-9rrgq"] Sep 29 11:00:00 crc kubenswrapper[4752]: W0929 11:00:00.936249 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda0d491d_1eb4_40b0_89d7_3697f8b60002.slice/crio-ce6a04b0a4e64ba24b8bb830975375b6197501a88bc017c1c35c3ffed66b589d 
WatchSource:0}: Error finding container ce6a04b0a4e64ba24b8bb830975375b6197501a88bc017c1c35c3ffed66b589d: Status 404 returned error can't find the container with id ce6a04b0a4e64ba24b8bb830975375b6197501a88bc017c1c35c3ffed66b589d Sep 29 11:00:01 crc kubenswrapper[4752]: I0929 11:00:01.627497 4752 generic.go:334] "Generic (PLEG): container finished" podID="da0d491d-1eb4-40b0-89d7-3697f8b60002" containerID="5befd21439875c604d8e0c7f9b7993c0db139074275ba3cd79ebc14a925b9dbf" exitCode=0 Sep 29 11:00:01 crc kubenswrapper[4752]: I0929 11:00:01.627573 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319060-9rrgq" event={"ID":"da0d491d-1eb4-40b0-89d7-3697f8b60002","Type":"ContainerDied","Data":"5befd21439875c604d8e0c7f9b7993c0db139074275ba3cd79ebc14a925b9dbf"} Sep 29 11:00:01 crc kubenswrapper[4752]: I0929 11:00:01.628026 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319060-9rrgq" event={"ID":"da0d491d-1eb4-40b0-89d7-3697f8b60002","Type":"ContainerStarted","Data":"ce6a04b0a4e64ba24b8bb830975375b6197501a88bc017c1c35c3ffed66b589d"} Sep 29 11:00:01 crc kubenswrapper[4752]: I0929 11:00:01.631560 4752 generic.go:334] "Generic (PLEG): container finished" podID="f94fa837-71f7-4540-b897-430e3f72928f" containerID="27c22c9c29ef7fa607e470ec7806bf6a8c192d8125c9028243a789a615a68554" exitCode=0 Sep 29 11:00:01 crc kubenswrapper[4752]: I0929 11:00:01.631594 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn" event={"ID":"f94fa837-71f7-4540-b897-430e3f72928f","Type":"ContainerDied","Data":"27c22c9c29ef7fa607e470ec7806bf6a8c192d8125c9028243a789a615a68554"} Sep 29 11:00:02 crc kubenswrapper[4752]: I0929 11:00:02.641914 4752 generic.go:334] "Generic (PLEG): container finished" podID="f94fa837-71f7-4540-b897-430e3f72928f" 
containerID="81770d467df5c6ddfc3b5f78e09c9344c0462dfbc4199cc1dbd64841fcdf586b" exitCode=0 Sep 29 11:00:02 crc kubenswrapper[4752]: I0929 11:00:02.642000 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn" event={"ID":"f94fa837-71f7-4540-b897-430e3f72928f","Type":"ContainerDied","Data":"81770d467df5c6ddfc3b5f78e09c9344c0462dfbc4199cc1dbd64841fcdf586b"} Sep 29 11:00:03 crc kubenswrapper[4752]: I0929 11:00:03.015104 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319060-9rrgq" Sep 29 11:00:03 crc kubenswrapper[4752]: I0929 11:00:03.194380 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da0d491d-1eb4-40b0-89d7-3697f8b60002-secret-volume\") pod \"da0d491d-1eb4-40b0-89d7-3697f8b60002\" (UID: \"da0d491d-1eb4-40b0-89d7-3697f8b60002\") " Sep 29 11:00:03 crc kubenswrapper[4752]: I0929 11:00:03.194465 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm9zs\" (UniqueName: \"kubernetes.io/projected/da0d491d-1eb4-40b0-89d7-3697f8b60002-kube-api-access-tm9zs\") pod \"da0d491d-1eb4-40b0-89d7-3697f8b60002\" (UID: \"da0d491d-1eb4-40b0-89d7-3697f8b60002\") " Sep 29 11:00:03 crc kubenswrapper[4752]: I0929 11:00:03.194535 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da0d491d-1eb4-40b0-89d7-3697f8b60002-config-volume\") pod \"da0d491d-1eb4-40b0-89d7-3697f8b60002\" (UID: \"da0d491d-1eb4-40b0-89d7-3697f8b60002\") " Sep 29 11:00:03 crc kubenswrapper[4752]: I0929 11:00:03.196589 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da0d491d-1eb4-40b0-89d7-3697f8b60002-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"da0d491d-1eb4-40b0-89d7-3697f8b60002" (UID: "da0d491d-1eb4-40b0-89d7-3697f8b60002"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 11:00:03 crc kubenswrapper[4752]: I0929 11:00:03.202363 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da0d491d-1eb4-40b0-89d7-3697f8b60002-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "da0d491d-1eb4-40b0-89d7-3697f8b60002" (UID: "da0d491d-1eb4-40b0-89d7-3697f8b60002"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:00:03 crc kubenswrapper[4752]: I0929 11:00:03.202552 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da0d491d-1eb4-40b0-89d7-3697f8b60002-kube-api-access-tm9zs" (OuterVolumeSpecName: "kube-api-access-tm9zs") pod "da0d491d-1eb4-40b0-89d7-3697f8b60002" (UID: "da0d491d-1eb4-40b0-89d7-3697f8b60002"). InnerVolumeSpecName "kube-api-access-tm9zs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:00:03 crc kubenswrapper[4752]: I0929 11:00:03.296861 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da0d491d-1eb4-40b0-89d7-3697f8b60002-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 11:00:03 crc kubenswrapper[4752]: I0929 11:00:03.296897 4752 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da0d491d-1eb4-40b0-89d7-3697f8b60002-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 29 11:00:03 crc kubenswrapper[4752]: I0929 11:00:03.296912 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm9zs\" (UniqueName: \"kubernetes.io/projected/da0d491d-1eb4-40b0-89d7-3697f8b60002-kube-api-access-tm9zs\") on node \"crc\" DevicePath \"\"" Sep 29 11:00:03 crc kubenswrapper[4752]: I0929 11:00:03.651501 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319060-9rrgq" event={"ID":"da0d491d-1eb4-40b0-89d7-3697f8b60002","Type":"ContainerDied","Data":"ce6a04b0a4e64ba24b8bb830975375b6197501a88bc017c1c35c3ffed66b589d"} Sep 29 11:00:03 crc kubenswrapper[4752]: I0929 11:00:03.651614 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce6a04b0a4e64ba24b8bb830975375b6197501a88bc017c1c35c3ffed66b589d" Sep 29 11:00:03 crc kubenswrapper[4752]: I0929 11:00:03.651531 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319060-9rrgq" Sep 29 11:00:04 crc kubenswrapper[4752]: I0929 11:00:04.028248 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn" Sep 29 11:00:04 crc kubenswrapper[4752]: I0929 11:00:04.209424 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f94fa837-71f7-4540-b897-430e3f72928f-bundle\") pod \"f94fa837-71f7-4540-b897-430e3f72928f\" (UID: \"f94fa837-71f7-4540-b897-430e3f72928f\") " Sep 29 11:00:04 crc kubenswrapper[4752]: I0929 11:00:04.209523 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f94fa837-71f7-4540-b897-430e3f72928f-util\") pod \"f94fa837-71f7-4540-b897-430e3f72928f\" (UID: \"f94fa837-71f7-4540-b897-430e3f72928f\") " Sep 29 11:00:04 crc kubenswrapper[4752]: I0929 11:00:04.209642 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr9j6\" (UniqueName: \"kubernetes.io/projected/f94fa837-71f7-4540-b897-430e3f72928f-kube-api-access-pr9j6\") pod \"f94fa837-71f7-4540-b897-430e3f72928f\" (UID: \"f94fa837-71f7-4540-b897-430e3f72928f\") " Sep 29 11:00:04 crc kubenswrapper[4752]: I0929 11:00:04.210064 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f94fa837-71f7-4540-b897-430e3f72928f-bundle" (OuterVolumeSpecName: "bundle") pod "f94fa837-71f7-4540-b897-430e3f72928f" (UID: "f94fa837-71f7-4540-b897-430e3f72928f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:00:04 crc kubenswrapper[4752]: I0929 11:00:04.214200 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f94fa837-71f7-4540-b897-430e3f72928f-kube-api-access-pr9j6" (OuterVolumeSpecName: "kube-api-access-pr9j6") pod "f94fa837-71f7-4540-b897-430e3f72928f" (UID: "f94fa837-71f7-4540-b897-430e3f72928f"). InnerVolumeSpecName "kube-api-access-pr9j6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:00:04 crc kubenswrapper[4752]: I0929 11:00:04.222978 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f94fa837-71f7-4540-b897-430e3f72928f-util" (OuterVolumeSpecName: "util") pod "f94fa837-71f7-4540-b897-430e3f72928f" (UID: "f94fa837-71f7-4540-b897-430e3f72928f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:00:04 crc kubenswrapper[4752]: I0929 11:00:04.312206 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr9j6\" (UniqueName: \"kubernetes.io/projected/f94fa837-71f7-4540-b897-430e3f72928f-kube-api-access-pr9j6\") on node \"crc\" DevicePath \"\"" Sep 29 11:00:04 crc kubenswrapper[4752]: I0929 11:00:04.312275 4752 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f94fa837-71f7-4540-b897-430e3f72928f-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:00:04 crc kubenswrapper[4752]: I0929 11:00:04.312294 4752 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f94fa837-71f7-4540-b897-430e3f72928f-util\") on node \"crc\" DevicePath \"\"" Sep 29 11:00:04 crc kubenswrapper[4752]: I0929 11:00:04.661357 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn" event={"ID":"f94fa837-71f7-4540-b897-430e3f72928f","Type":"ContainerDied","Data":"31bc7abc54dd66c31fc97bf530de78ed2d9e0381eb83c8df5f72060f82e9aff4"} Sep 29 11:00:04 crc kubenswrapper[4752]: I0929 11:00:04.661454 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31bc7abc54dd66c31fc97bf530de78ed2d9e0381eb83c8df5f72060f82e9aff4" Sep 29 11:00:04 crc kubenswrapper[4752]: I0929 11:00:04.661404 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn" Sep 29 11:00:07 crc kubenswrapper[4752]: I0929 11:00:07.217627 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6"] Sep 29 11:00:07 crc kubenswrapper[4752]: E0929 11:00:07.218425 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94fa837-71f7-4540-b897-430e3f72928f" containerName="util" Sep 29 11:00:07 crc kubenswrapper[4752]: I0929 11:00:07.218445 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94fa837-71f7-4540-b897-430e3f72928f" containerName="util" Sep 29 11:00:07 crc kubenswrapper[4752]: E0929 11:00:07.218457 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94fa837-71f7-4540-b897-430e3f72928f" containerName="extract" Sep 29 11:00:07 crc kubenswrapper[4752]: I0929 11:00:07.218464 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94fa837-71f7-4540-b897-430e3f72928f" containerName="extract" Sep 29 11:00:07 crc kubenswrapper[4752]: E0929 11:00:07.218477 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da0d491d-1eb4-40b0-89d7-3697f8b60002" containerName="collect-profiles" Sep 29 11:00:07 crc kubenswrapper[4752]: I0929 11:00:07.218483 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="da0d491d-1eb4-40b0-89d7-3697f8b60002" containerName="collect-profiles" Sep 29 11:00:07 crc kubenswrapper[4752]: E0929 11:00:07.218502 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94fa837-71f7-4540-b897-430e3f72928f" containerName="pull" Sep 29 11:00:07 crc kubenswrapper[4752]: I0929 11:00:07.218507 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94fa837-71f7-4540-b897-430e3f72928f" containerName="pull" Sep 29 11:00:07 crc kubenswrapper[4752]: I0929 11:00:07.218629 4752 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="da0d491d-1eb4-40b0-89d7-3697f8b60002" containerName="collect-profiles" Sep 29 11:00:07 crc kubenswrapper[4752]: I0929 11:00:07.218649 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f94fa837-71f7-4540-b897-430e3f72928f" containerName="extract" Sep 29 11:00:07 crc kubenswrapper[4752]: I0929 11:00:07.219529 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6" Sep 29 11:00:07 crc kubenswrapper[4752]: I0929 11:00:07.221823 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-mxvzv" Sep 29 11:00:07 crc kubenswrapper[4752]: I0929 11:00:07.247285 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6"] Sep 29 11:00:07 crc kubenswrapper[4752]: I0929 11:00:07.361505 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl8wr\" (UniqueName: \"kubernetes.io/projected/0f791715-ebb0-42db-b8fe-d8ab636b6bb2-kube-api-access-gl8wr\") pod \"openstack-operator-controller-operator-d99bfc6df-bjdz6\" (UID: \"0f791715-ebb0-42db-b8fe-d8ab636b6bb2\") " pod="openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6" Sep 29 11:00:07 crc kubenswrapper[4752]: I0929 11:00:07.463683 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl8wr\" (UniqueName: \"kubernetes.io/projected/0f791715-ebb0-42db-b8fe-d8ab636b6bb2-kube-api-access-gl8wr\") pod \"openstack-operator-controller-operator-d99bfc6df-bjdz6\" (UID: \"0f791715-ebb0-42db-b8fe-d8ab636b6bb2\") " pod="openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6" Sep 29 11:00:07 crc kubenswrapper[4752]: I0929 11:00:07.483529 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gl8wr\" (UniqueName: \"kubernetes.io/projected/0f791715-ebb0-42db-b8fe-d8ab636b6bb2-kube-api-access-gl8wr\") pod \"openstack-operator-controller-operator-d99bfc6df-bjdz6\" (UID: \"0f791715-ebb0-42db-b8fe-d8ab636b6bb2\") " pod="openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6" Sep 29 11:00:07 crc kubenswrapper[4752]: I0929 11:00:07.543967 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6" Sep 29 11:00:08 crc kubenswrapper[4752]: I0929 11:00:08.009895 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6"] Sep 29 11:00:08 crc kubenswrapper[4752]: W0929 11:00:08.024208 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f791715_ebb0_42db_b8fe_d8ab636b6bb2.slice/crio-07cb7bf2865db26872bcfb7f57e495159a9b768908854359059416eab3069b00 WatchSource:0}: Error finding container 07cb7bf2865db26872bcfb7f57e495159a9b768908854359059416eab3069b00: Status 404 returned error can't find the container with id 07cb7bf2865db26872bcfb7f57e495159a9b768908854359059416eab3069b00 Sep 29 11:00:08 crc kubenswrapper[4752]: I0929 11:00:08.691315 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6" event={"ID":"0f791715-ebb0-42db-b8fe-d8ab636b6bb2","Type":"ContainerStarted","Data":"07cb7bf2865db26872bcfb7f57e495159a9b768908854359059416eab3069b00"} Sep 29 11:00:12 crc kubenswrapper[4752]: I0929 11:00:12.723777 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6" event={"ID":"0f791715-ebb0-42db-b8fe-d8ab636b6bb2","Type":"ContainerStarted","Data":"11a66f3c5379b55a900c64074dc7d5dc5543cd96b9dab995ba249b4ca7985f79"} Sep 29 11:00:14 
crc kubenswrapper[4752]: I0929 11:00:14.739956 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6" event={"ID":"0f791715-ebb0-42db-b8fe-d8ab636b6bb2","Type":"ContainerStarted","Data":"391ef9bae562fcc2f50ccfa518484444a0d56e60fc13efacfdb5cef3e57933b1"} Sep 29 11:00:14 crc kubenswrapper[4752]: I0929 11:00:14.740401 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6" Sep 29 11:00:14 crc kubenswrapper[4752]: I0929 11:00:14.780900 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6" podStartSLOduration=1.454712171 podStartE2EDuration="7.780875042s" podCreationTimestamp="2025-09-29 11:00:07 +0000 UTC" firstStartedPulling="2025-09-29 11:00:08.026996123 +0000 UTC m=+948.816137780" lastFinishedPulling="2025-09-29 11:00:14.353158984 +0000 UTC m=+955.142300651" observedRunningTime="2025-09-29 11:00:14.779108766 +0000 UTC m=+955.568250453" watchObservedRunningTime="2025-09-29 11:00:14.780875042 +0000 UTC m=+955.570016729" Sep 29 11:00:17 crc kubenswrapper[4752]: I0929 11:00:17.547665 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.715577 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-748c574d75-s6kn6"] Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.717723 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-s6kn6" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.719364 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6495d75b5-vdwdn"] Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.720356 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-vdwdn" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.720405 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-k6pv7" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.722622 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-4bmss" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.729922 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d74f4d695-gnrn4"] Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.731232 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-gnrn4" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.733311 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-b4t9q" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.741108 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-748c574d75-s6kn6"] Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.745685 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6495d75b5-vdwdn"] Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.766357 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-67b5d44b7f-fzv5x"] Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.767446 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-fzv5x" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.770615 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-85s9w" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.800774 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d74f4d695-gnrn4"] Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.811868 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-8ff95898-m96g4"] Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.813132 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-8ff95898-m96g4" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.818681 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tftsx\" (UniqueName: \"kubernetes.io/projected/22f0a092-282c-4339-b57e-29bba94f1c26-kube-api-access-tftsx\") pod \"cinder-operator-controller-manager-748c574d75-s6kn6\" (UID: \"22f0a092-282c-4339-b57e-29bba94f1c26\") " pod="openstack-operators/cinder-operator-controller-manager-748c574d75-s6kn6" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.818767 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnzcc\" (UniqueName: \"kubernetes.io/projected/bcc941f9-bb40-42c6-a04a-2dbccdb5c63d-kube-api-access-pnzcc\") pod \"barbican-operator-controller-manager-6495d75b5-vdwdn\" (UID: \"bcc941f9-bb40-42c6-a04a-2dbccdb5c63d\") " pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-vdwdn" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.821878 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-kcvhk" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.822068 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-67b5d44b7f-fzv5x"] Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.850093 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-695847bc78-6ffsd"] Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.851382 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-6ffsd" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.858161 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-ntjhv" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.865873 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-8ff95898-m96g4"] Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.886504 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-695847bc78-6ffsd"] Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.920933 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6d76\" (UniqueName: \"kubernetes.io/projected/d3d16719-6f3a-40f3-a68e-7ca209588644-kube-api-access-s6d76\") pod \"designate-operator-controller-manager-7d74f4d695-gnrn4\" (UID: \"d3d16719-6f3a-40f3-a68e-7ca209588644\") " pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-gnrn4" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.920991 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxxrg\" (UniqueName: \"kubernetes.io/projected/ce29c55b-caec-4b27-b9fd-e815c897c38e-kube-api-access-hxxrg\") pod \"heat-operator-controller-manager-8ff95898-m96g4\" (UID: \"ce29c55b-caec-4b27-b9fd-e815c897c38e\") " pod="openstack-operators/heat-operator-controller-manager-8ff95898-m96g4" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.921061 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tftsx\" (UniqueName: \"kubernetes.io/projected/22f0a092-282c-4339-b57e-29bba94f1c26-kube-api-access-tftsx\") pod \"cinder-operator-controller-manager-748c574d75-s6kn6\" (UID: 
\"22f0a092-282c-4339-b57e-29bba94f1c26\") " pod="openstack-operators/cinder-operator-controller-manager-748c574d75-s6kn6" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.921105 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnzcc\" (UniqueName: \"kubernetes.io/projected/bcc941f9-bb40-42c6-a04a-2dbccdb5c63d-kube-api-access-pnzcc\") pod \"barbican-operator-controller-manager-6495d75b5-vdwdn\" (UID: \"bcc941f9-bb40-42c6-a04a-2dbccdb5c63d\") " pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-vdwdn" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.921130 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cnfv\" (UniqueName: \"kubernetes.io/projected/02766095-b09a-47c5-b0c0-8577cf0c4df0-kube-api-access-8cnfv\") pod \"glance-operator-controller-manager-67b5d44b7f-fzv5x\" (UID: \"02766095-b09a-47c5-b0c0-8577cf0c4df0\") " pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-fzv5x" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.928257 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-858cd69f49-c2twd"] Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.929451 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-c2twd" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.932403 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.940711 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-clfn7" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.947216 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9fc8d5567-lncbz"] Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.948429 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-lncbz" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.959548 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-lzmcx" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.971287 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnzcc\" (UniqueName: \"kubernetes.io/projected/bcc941f9-bb40-42c6-a04a-2dbccdb5c63d-kube-api-access-pnzcc\") pod \"barbican-operator-controller-manager-6495d75b5-vdwdn\" (UID: \"bcc941f9-bb40-42c6-a04a-2dbccdb5c63d\") " pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-vdwdn" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.972343 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7bf498966c-vwbvq"] Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.973461 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-vwbvq" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.974854 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tftsx\" (UniqueName: \"kubernetes.io/projected/22f0a092-282c-4339-b57e-29bba94f1c26-kube-api-access-tftsx\") pod \"cinder-operator-controller-manager-748c574d75-s6kn6\" (UID: \"22f0a092-282c-4339-b57e-29bba94f1c26\") " pod="openstack-operators/cinder-operator-controller-manager-748c574d75-s6kn6" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.976733 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-zfxw5" Sep 29 11:00:43 crc kubenswrapper[4752]: I0929 11:00:43.988422 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9fc8d5567-lncbz"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.008106 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-858cd69f49-c2twd"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.013110 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-687b9cf756-qnmsk"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.014078 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-qnmsk" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.019836 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-56cf9c6b99-vxd7j"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.020776 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-vxd7j" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.021887 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf96v\" (UniqueName: \"kubernetes.io/projected/e06f89cc-db11-4692-ab2c-50405feb9ca1-kube-api-access-qf96v\") pod \"infra-operator-controller-manager-858cd69f49-c2twd\" (UID: \"e06f89cc-db11-4692-ab2c-50405feb9ca1\") " pod="openstack-operators/infra-operator-controller-manager-858cd69f49-c2twd" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.021928 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6d76\" (UniqueName: \"kubernetes.io/projected/d3d16719-6f3a-40f3-a68e-7ca209588644-kube-api-access-s6d76\") pod \"designate-operator-controller-manager-7d74f4d695-gnrn4\" (UID: \"d3d16719-6f3a-40f3-a68e-7ca209588644\") " pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-gnrn4" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.021962 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxxrg\" (UniqueName: \"kubernetes.io/projected/ce29c55b-caec-4b27-b9fd-e815c897c38e-kube-api-access-hxxrg\") pod \"heat-operator-controller-manager-8ff95898-m96g4\" (UID: \"ce29c55b-caec-4b27-b9fd-e815c897c38e\") " pod="openstack-operators/heat-operator-controller-manager-8ff95898-m96g4" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.021987 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gr2p\" (UniqueName: \"kubernetes.io/projected/4c1a6d67-1063-44bf-a2fa-be8dde72fabf-kube-api-access-7gr2p\") pod \"horizon-operator-controller-manager-695847bc78-6ffsd\" (UID: \"4c1a6d67-1063-44bf-a2fa-be8dde72fabf\") " pod="openstack-operators/horizon-operator-controller-manager-695847bc78-6ffsd" Sep 29 11:00:44 crc 
kubenswrapper[4752]: I0929 11:00:44.022021 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e06f89cc-db11-4692-ab2c-50405feb9ca1-cert\") pod \"infra-operator-controller-manager-858cd69f49-c2twd\" (UID: \"e06f89cc-db11-4692-ab2c-50405feb9ca1\") " pod="openstack-operators/infra-operator-controller-manager-858cd69f49-c2twd" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.022082 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cnfv\" (UniqueName: \"kubernetes.io/projected/02766095-b09a-47c5-b0c0-8577cf0c4df0-kube-api-access-8cnfv\") pod \"glance-operator-controller-manager-67b5d44b7f-fzv5x\" (UID: \"02766095-b09a-47c5-b0c0-8577cf0c4df0\") " pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-fzv5x" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.032168 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-nh924" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.032521 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-qk944" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.049103 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7bf498966c-vwbvq"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.049723 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-s6kn6" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.053698 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-687b9cf756-qnmsk"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.060348 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-vdwdn" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.061795 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cnfv\" (UniqueName: \"kubernetes.io/projected/02766095-b09a-47c5-b0c0-8577cf0c4df0-kube-api-access-8cnfv\") pod \"glance-operator-controller-manager-67b5d44b7f-fzv5x\" (UID: \"02766095-b09a-47c5-b0c0-8577cf0c4df0\") " pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-fzv5x" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.082330 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54d766c9f9-w5nvh"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.084004 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-w5nvh" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.087188 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-989g4" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.088084 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6d76\" (UniqueName: \"kubernetes.io/projected/d3d16719-6f3a-40f3-a68e-7ca209588644-kube-api-access-s6d76\") pod \"designate-operator-controller-manager-7d74f4d695-gnrn4\" (UID: \"d3d16719-6f3a-40f3-a68e-7ca209588644\") " pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-gnrn4" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.091641 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-5nh8j"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.092916 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-fzv5x" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.093240 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-5nh8j" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.110580 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxxrg\" (UniqueName: \"kubernetes.io/projected/ce29c55b-caec-4b27-b9fd-e815c897c38e-kube-api-access-hxxrg\") pod \"heat-operator-controller-manager-8ff95898-m96g4\" (UID: \"ce29c55b-caec-4b27-b9fd-e815c897c38e\") " pod="openstack-operators/heat-operator-controller-manager-8ff95898-m96g4" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.111140 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-7jrxv" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.130082 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fc4m\" (UniqueName: \"kubernetes.io/projected/28e91257-715f-462e-be70-361652522cb3-kube-api-access-7fc4m\") pod \"mariadb-operator-controller-manager-687b9cf756-qnmsk\" (UID: \"28e91257-715f-462e-be70-361652522cb3\") " pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-qnmsk" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.130171 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf96v\" (UniqueName: \"kubernetes.io/projected/e06f89cc-db11-4692-ab2c-50405feb9ca1-kube-api-access-qf96v\") pod \"infra-operator-controller-manager-858cd69f49-c2twd\" (UID: \"e06f89cc-db11-4692-ab2c-50405feb9ca1\") " pod="openstack-operators/infra-operator-controller-manager-858cd69f49-c2twd" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.130303 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gr2p\" (UniqueName: \"kubernetes.io/projected/4c1a6d67-1063-44bf-a2fa-be8dde72fabf-kube-api-access-7gr2p\") pod 
\"horizon-operator-controller-manager-695847bc78-6ffsd\" (UID: \"4c1a6d67-1063-44bf-a2fa-be8dde72fabf\") " pod="openstack-operators/horizon-operator-controller-manager-695847bc78-6ffsd" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.130377 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghf28\" (UniqueName: \"kubernetes.io/projected/c95eca0a-b789-4db4-a906-d3323f1ee7ed-kube-api-access-ghf28\") pod \"manila-operator-controller-manager-56cf9c6b99-vxd7j\" (UID: \"c95eca0a-b789-4db4-a906-d3323f1ee7ed\") " pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-vxd7j" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.130470 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c46b\" (UniqueName: \"kubernetes.io/projected/4ba7065e-6eff-42bb-acc3-2595f5cc8e71-kube-api-access-8c46b\") pod \"keystone-operator-controller-manager-7bf498966c-vwbvq\" (UID: \"4ba7065e-6eff-42bb-acc3-2595f5cc8e71\") " pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-vwbvq" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.130498 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e06f89cc-db11-4692-ab2c-50405feb9ca1-cert\") pod \"infra-operator-controller-manager-858cd69f49-c2twd\" (UID: \"e06f89cc-db11-4692-ab2c-50405feb9ca1\") " pod="openstack-operators/infra-operator-controller-manager-858cd69f49-c2twd" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.130526 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc2dz\" (UniqueName: \"kubernetes.io/projected/6f6d7f0e-296b-493f-8e5b-82dc348a3e6d-kube-api-access-qc2dz\") pod \"ironic-operator-controller-manager-9fc8d5567-lncbz\" (UID: \"6f6d7f0e-296b-493f-8e5b-82dc348a3e6d\") " 
pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-lncbz" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.168303 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-8ff95898-m96g4" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.174048 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54d766c9f9-w5nvh"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.206192 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e06f89cc-db11-4692-ab2c-50405feb9ca1-cert\") pod \"infra-operator-controller-manager-858cd69f49-c2twd\" (UID: \"e06f89cc-db11-4692-ab2c-50405feb9ca1\") " pod="openstack-operators/infra-operator-controller-manager-858cd69f49-c2twd" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.234498 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghf28\" (UniqueName: \"kubernetes.io/projected/c95eca0a-b789-4db4-a906-d3323f1ee7ed-kube-api-access-ghf28\") pod \"manila-operator-controller-manager-56cf9c6b99-vxd7j\" (UID: \"c95eca0a-b789-4db4-a906-d3323f1ee7ed\") " pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-vxd7j" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.234580 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c46b\" (UniqueName: \"kubernetes.io/projected/4ba7065e-6eff-42bb-acc3-2595f5cc8e71-kube-api-access-8c46b\") pod \"keystone-operator-controller-manager-7bf498966c-vwbvq\" (UID: \"4ba7065e-6eff-42bb-acc3-2595f5cc8e71\") " pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-vwbvq" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.234624 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc2dz\" 
(UniqueName: \"kubernetes.io/projected/6f6d7f0e-296b-493f-8e5b-82dc348a3e6d-kube-api-access-qc2dz\") pod \"ironic-operator-controller-manager-9fc8d5567-lncbz\" (UID: \"6f6d7f0e-296b-493f-8e5b-82dc348a3e6d\") " pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-lncbz" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.234693 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzrh8\" (UniqueName: \"kubernetes.io/projected/97664b33-b187-472d-8e08-462174d3e49a-kube-api-access-fzrh8\") pod \"neutron-operator-controller-manager-54d766c9f9-w5nvh\" (UID: \"97664b33-b187-472d-8e08-462174d3e49a\") " pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-w5nvh" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.234753 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fc4m\" (UniqueName: \"kubernetes.io/projected/28e91257-715f-462e-be70-361652522cb3-kube-api-access-7fc4m\") pod \"mariadb-operator-controller-manager-687b9cf756-qnmsk\" (UID: \"28e91257-715f-462e-be70-361652522cb3\") " pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-qnmsk" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.234827 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h8h4\" (UniqueName: \"kubernetes.io/projected/db7baf7a-c952-4e2d-adaa-87815a1ad895-kube-api-access-4h8h4\") pod \"nova-operator-controller-manager-c7c776c96-5nh8j\" (UID: \"db7baf7a-c952-4e2d-adaa-87815a1ad895\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-5nh8j" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.235771 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-56cf9c6b99-vxd7j"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.248610 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7gr2p\" (UniqueName: \"kubernetes.io/projected/4c1a6d67-1063-44bf-a2fa-be8dde72fabf-kube-api-access-7gr2p\") pod \"horizon-operator-controller-manager-695847bc78-6ffsd\" (UID: \"4c1a6d67-1063-44bf-a2fa-be8dde72fabf\") " pod="openstack-operators/horizon-operator-controller-manager-695847bc78-6ffsd" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.256124 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-gkcw9"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.257461 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-gkcw9" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.261696 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-skjrq" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.261934 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf96v\" (UniqueName: \"kubernetes.io/projected/e06f89cc-db11-4692-ab2c-50405feb9ca1-kube-api-access-qf96v\") pod \"infra-operator-controller-manager-858cd69f49-c2twd\" (UID: \"e06f89cc-db11-4692-ab2c-50405feb9ca1\") " pod="openstack-operators/infra-operator-controller-manager-858cd69f49-c2twd" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.262076 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-gkcw9"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.263726 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c46b\" (UniqueName: \"kubernetes.io/projected/4ba7065e-6eff-42bb-acc3-2595f5cc8e71-kube-api-access-8c46b\") pod \"keystone-operator-controller-manager-7bf498966c-vwbvq\" (UID: \"4ba7065e-6eff-42bb-acc3-2595f5cc8e71\") " 
pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-vwbvq" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.265146 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc2dz\" (UniqueName: \"kubernetes.io/projected/6f6d7f0e-296b-493f-8e5b-82dc348a3e6d-kube-api-access-qc2dz\") pod \"ironic-operator-controller-manager-9fc8d5567-lncbz\" (UID: \"6f6d7f0e-296b-493f-8e5b-82dc348a3e6d\") " pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-lncbz" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.272169 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghf28\" (UniqueName: \"kubernetes.io/projected/c95eca0a-b789-4db4-a906-d3323f1ee7ed-kube-api-access-ghf28\") pod \"manila-operator-controller-manager-56cf9c6b99-vxd7j\" (UID: \"c95eca0a-b789-4db4-a906-d3323f1ee7ed\") " pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-vxd7j" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.272440 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fc4m\" (UniqueName: \"kubernetes.io/projected/28e91257-715f-462e-be70-361652522cb3-kube-api-access-7fc4m\") pod \"mariadb-operator-controller-manager-687b9cf756-qnmsk\" (UID: \"28e91257-715f-462e-be70-361652522cb3\") " pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-qnmsk" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.275362 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-5nh8j"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.286165 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-vxd7j" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.292252 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-474s5"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.294999 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-474s5" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.297911 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-gmrpg" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.298522 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.311299 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5f95c46c78-tg8bp"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.312569 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-tg8bp" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.321343 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-tjqzp" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.342770 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzrh8\" (UniqueName: \"kubernetes.io/projected/97664b33-b187-472d-8e08-462174d3e49a-kube-api-access-fzrh8\") pod \"neutron-operator-controller-manager-54d766c9f9-w5nvh\" (UID: \"97664b33-b187-472d-8e08-462174d3e49a\") " pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-w5nvh" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.357765 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h8h4\" (UniqueName: \"kubernetes.io/projected/db7baf7a-c952-4e2d-adaa-87815a1ad895-kube-api-access-4h8h4\") pod \"nova-operator-controller-manager-c7c776c96-5nh8j\" (UID: \"db7baf7a-c952-4e2d-adaa-87815a1ad895\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-5nh8j" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.346206 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-774b97b48-zgk9d"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.346922 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-lncbz" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.366049 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5f95c46c78-tg8bp"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.366483 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-774b97b48-zgk9d" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.366519 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-774b97b48-zgk9d"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.366745 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-vwbvq" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.370147 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-s7qgx" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.370735 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-gnrn4" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.374874 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-474s5"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.381654 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzrh8\" (UniqueName: \"kubernetes.io/projected/97664b33-b187-472d-8e08-462174d3e49a-kube-api-access-fzrh8\") pod \"neutron-operator-controller-manager-54d766c9f9-w5nvh\" (UID: \"97664b33-b187-472d-8e08-462174d3e49a\") " pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-w5nvh" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.381761 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-48hc9"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.383978 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-48hc9" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.387721 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-g82n6"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.389196 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-p6md9" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.389311 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-g82n6" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.392762 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-lmwfw" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.400321 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-g82n6"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.404082 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-48hc9"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.407003 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-rwzzh"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.413525 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-rwzzh" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.417730 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-lgx2j" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.418120 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-rwzzh"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.428848 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.429735 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h8h4\" (UniqueName: \"kubernetes.io/projected/db7baf7a-c952-4e2d-adaa-87815a1ad895-kube-api-access-4h8h4\") pod \"nova-operator-controller-manager-c7c776c96-5nh8j\" (UID: \"db7baf7a-c952-4e2d-adaa-87815a1ad895\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-5nh8j" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.430124 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.437964 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-wg26p" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.443206 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.462022 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgcmf\" (UniqueName: \"kubernetes.io/projected/06abe84c-c450-4214-b94c-dc8ac39422bd-kube-api-access-xgcmf\") pod \"placement-operator-controller-manager-774b97b48-zgk9d\" (UID: \"06abe84c-c450-4214-b94c-dc8ac39422bd\") " pod="openstack-operators/placement-operator-controller-manager-774b97b48-zgk9d" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.462108 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kx6c\" (UniqueName: \"kubernetes.io/projected/6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a-kube-api-access-7kx6c\") pod \"openstack-baremetal-operator-controller-manager-6d776955-474s5\" (UID: \"6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-474s5" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.462153 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btdj6\" (UniqueName: \"kubernetes.io/projected/f6d91c53-a4c9-4516-8666-75dac446a27e-kube-api-access-btdj6\") pod \"swift-operator-controller-manager-bc7dc7bd9-48hc9\" (UID: \"f6d91c53-a4c9-4516-8666-75dac446a27e\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-48hc9" Sep 29 11:00:44 crc 
kubenswrapper[4752]: I0929 11:00:44.462211 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59m5c\" (UniqueName: \"kubernetes.io/projected/ac414d81-10d8-4ef8-aeed-3f8bf43eae1d-kube-api-access-59m5c\") pod \"telemetry-operator-controller-manager-5bf96cfbc4-g82n6\" (UID: \"ac414d81-10d8-4ef8-aeed-3f8bf43eae1d\") " pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-g82n6" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.462263 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2c82\" (UniqueName: \"kubernetes.io/projected/2aa7ba00-f408-407e-83cc-ce6b2c5b51fa-kube-api-access-w2c82\") pod \"watcher-operator-controller-manager-586b879c47-crvdt\" (UID: \"2aa7ba00-f408-407e-83cc-ce6b2c5b51fa\") " pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.462297 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kxdm\" (UniqueName: \"kubernetes.io/projected/c35176b6-d50b-4e71-9c73-063a8213988c-kube-api-access-5kxdm\") pod \"octavia-operator-controller-manager-76fcc6dc7c-gkcw9\" (UID: \"c35176b6-d50b-4e71-9c73-063a8213988c\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-gkcw9" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.462337 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2pwb\" (UniqueName: \"kubernetes.io/projected/54b928b4-cc8d-4093-8aef-4d6540a226c3-kube-api-access-s2pwb\") pod \"ovn-operator-controller-manager-5f95c46c78-tg8bp\" (UID: \"54b928b4-cc8d-4093-8aef-4d6540a226c3\") " pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-tg8bp" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.462359 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-474s5\" (UID: \"6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-474s5" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.462403 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27dl8\" (UniqueName: \"kubernetes.io/projected/ec44f1a0-8747-41d6-bf36-9899957065fc-kube-api-access-27dl8\") pod \"test-operator-controller-manager-f66b554c6-rwzzh\" (UID: \"ec44f1a0-8747-41d6-bf36-9899957065fc\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-rwzzh" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.484092 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-6ffsd" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.535128 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-qnmsk" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.543050 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-74b6d8f6ff-gsjz5"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.554697 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-74b6d8f6ff-gsjz5" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.555276 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-c2twd" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.559985 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-5nh8j" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.564722 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btdj6\" (UniqueName: \"kubernetes.io/projected/f6d91c53-a4c9-4516-8666-75dac446a27e-kube-api-access-btdj6\") pod \"swift-operator-controller-manager-bc7dc7bd9-48hc9\" (UID: \"f6d91c53-a4c9-4516-8666-75dac446a27e\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-48hc9" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.564779 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59m5c\" (UniqueName: \"kubernetes.io/projected/ac414d81-10d8-4ef8-aeed-3f8bf43eae1d-kube-api-access-59m5c\") pod \"telemetry-operator-controller-manager-5bf96cfbc4-g82n6\" (UID: \"ac414d81-10d8-4ef8-aeed-3f8bf43eae1d\") " pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-g82n6" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.564816 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2c82\" (UniqueName: \"kubernetes.io/projected/2aa7ba00-f408-407e-83cc-ce6b2c5b51fa-kube-api-access-w2c82\") pod \"watcher-operator-controller-manager-586b879c47-crvdt\" (UID: \"2aa7ba00-f408-407e-83cc-ce6b2c5b51fa\") " pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.564843 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kxdm\" (UniqueName: \"kubernetes.io/projected/c35176b6-d50b-4e71-9c73-063a8213988c-kube-api-access-5kxdm\") pod 
\"octavia-operator-controller-manager-76fcc6dc7c-gkcw9\" (UID: \"c35176b6-d50b-4e71-9c73-063a8213988c\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-gkcw9" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.564875 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2pwb\" (UniqueName: \"kubernetes.io/projected/54b928b4-cc8d-4093-8aef-4d6540a226c3-kube-api-access-s2pwb\") pod \"ovn-operator-controller-manager-5f95c46c78-tg8bp\" (UID: \"54b928b4-cc8d-4093-8aef-4d6540a226c3\") " pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-tg8bp" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.564892 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-474s5\" (UID: \"6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-474s5" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.564923 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27dl8\" (UniqueName: \"kubernetes.io/projected/ec44f1a0-8747-41d6-bf36-9899957065fc-kube-api-access-27dl8\") pod \"test-operator-controller-manager-f66b554c6-rwzzh\" (UID: \"ec44f1a0-8747-41d6-bf36-9899957065fc\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-rwzzh" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.564960 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgcmf\" (UniqueName: \"kubernetes.io/projected/06abe84c-c450-4214-b94c-dc8ac39422bd-kube-api-access-xgcmf\") pod \"placement-operator-controller-manager-774b97b48-zgk9d\" (UID: \"06abe84c-c450-4214-b94c-dc8ac39422bd\") " pod="openstack-operators/placement-operator-controller-manager-774b97b48-zgk9d" 
Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.564980 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kx6c\" (UniqueName: \"kubernetes.io/projected/6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a-kube-api-access-7kx6c\") pod \"openstack-baremetal-operator-controller-manager-6d776955-474s5\" (UID: \"6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-474s5" Sep 29 11:00:44 crc kubenswrapper[4752]: E0929 11:00:44.565445 4752 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 29 11:00:44 crc kubenswrapper[4752]: E0929 11:00:44.565496 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a-cert podName:6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a nodeName:}" failed. No retries permitted until 2025-09-29 11:00:45.065479685 +0000 UTC m=+985.854621352 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a-cert") pod "openstack-baremetal-operator-controller-manager-6d776955-474s5" (UID: "6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.574113 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.574383 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-4gfgd" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.593421 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-74b6d8f6ff-gsjz5"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.627726 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-w5nvh" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.642019 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2c82\" (UniqueName: \"kubernetes.io/projected/2aa7ba00-f408-407e-83cc-ce6b2c5b51fa-kube-api-access-w2c82\") pod \"watcher-operator-controller-manager-586b879c47-crvdt\" (UID: \"2aa7ba00-f408-407e-83cc-ce6b2c5b51fa\") " pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.666446 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tm8q\" (UniqueName: \"kubernetes.io/projected/5471e4e9-e190-4c84-a23b-090158ae7133-kube-api-access-8tm8q\") pod \"openstack-operator-controller-manager-74b6d8f6ff-gsjz5\" (UID: \"5471e4e9-e190-4c84-a23b-090158ae7133\") " pod="openstack-operators/openstack-operator-controller-manager-74b6d8f6ff-gsjz5" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.666550 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5471e4e9-e190-4c84-a23b-090158ae7133-cert\") pod \"openstack-operator-controller-manager-74b6d8f6ff-gsjz5\" (UID: \"5471e4e9-e190-4c84-a23b-090158ae7133\") " pod="openstack-operators/openstack-operator-controller-manager-74b6d8f6ff-gsjz5" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.674542 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59m5c\" (UniqueName: \"kubernetes.io/projected/ac414d81-10d8-4ef8-aeed-3f8bf43eae1d-kube-api-access-59m5c\") pod \"telemetry-operator-controller-manager-5bf96cfbc4-g82n6\" (UID: \"ac414d81-10d8-4ef8-aeed-3f8bf43eae1d\") " pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-g82n6" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 
11:00:44.675902 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgcmf\" (UniqueName: \"kubernetes.io/projected/06abe84c-c450-4214-b94c-dc8ac39422bd-kube-api-access-xgcmf\") pod \"placement-operator-controller-manager-774b97b48-zgk9d\" (UID: \"06abe84c-c450-4214-b94c-dc8ac39422bd\") " pod="openstack-operators/placement-operator-controller-manager-774b97b48-zgk9d" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.676005 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kxdm\" (UniqueName: \"kubernetes.io/projected/c35176b6-d50b-4e71-9c73-063a8213988c-kube-api-access-5kxdm\") pod \"octavia-operator-controller-manager-76fcc6dc7c-gkcw9\" (UID: \"c35176b6-d50b-4e71-9c73-063a8213988c\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-gkcw9" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.676520 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kx6c\" (UniqueName: \"kubernetes.io/projected/6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a-kube-api-access-7kx6c\") pod \"openstack-baremetal-operator-controller-manager-6d776955-474s5\" (UID: \"6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-474s5" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.677320 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27dl8\" (UniqueName: \"kubernetes.io/projected/ec44f1a0-8747-41d6-bf36-9899957065fc-kube-api-access-27dl8\") pod \"test-operator-controller-manager-f66b554c6-rwzzh\" (UID: \"ec44f1a0-8747-41d6-bf36-9899957065fc\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-rwzzh" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.683020 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btdj6\" (UniqueName: 
\"kubernetes.io/projected/f6d91c53-a4c9-4516-8666-75dac446a27e-kube-api-access-btdj6\") pod \"swift-operator-controller-manager-bc7dc7bd9-48hc9\" (UID: \"f6d91c53-a4c9-4516-8666-75dac446a27e\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-48hc9" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.688052 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-l55rr"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.689074 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-l55rr" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.721955 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-gcp29" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.724456 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2pwb\" (UniqueName: \"kubernetes.io/projected/54b928b4-cc8d-4093-8aef-4d6540a226c3-kube-api-access-s2pwb\") pod \"ovn-operator-controller-manager-5f95c46c78-tg8bp\" (UID: \"54b928b4-cc8d-4093-8aef-4d6540a226c3\") " pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-tg8bp" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.773136 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tm8q\" (UniqueName: \"kubernetes.io/projected/5471e4e9-e190-4c84-a23b-090158ae7133-kube-api-access-8tm8q\") pod \"openstack-operator-controller-manager-74b6d8f6ff-gsjz5\" (UID: \"5471e4e9-e190-4c84-a23b-090158ae7133\") " pod="openstack-operators/openstack-operator-controller-manager-74b6d8f6ff-gsjz5" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.773243 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/5471e4e9-e190-4c84-a23b-090158ae7133-cert\") pod \"openstack-operator-controller-manager-74b6d8f6ff-gsjz5\" (UID: \"5471e4e9-e190-4c84-a23b-090158ae7133\") " pod="openstack-operators/openstack-operator-controller-manager-74b6d8f6ff-gsjz5" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.773364 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpqx9\" (UniqueName: \"kubernetes.io/projected/5b974650-8f7f-4826-b767-0dcd35bb6f3f-kube-api-access-rpqx9\") pod \"rabbitmq-cluster-operator-manager-79d8469568-l55rr\" (UID: \"5b974650-8f7f-4826-b767-0dcd35bb6f3f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-l55rr" Sep 29 11:00:44 crc kubenswrapper[4752]: E0929 11:00:44.773753 4752 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 29 11:00:44 crc kubenswrapper[4752]: E0929 11:00:44.782155 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5471e4e9-e190-4c84-a23b-090158ae7133-cert podName:5471e4e9-e190-4c84-a23b-090158ae7133 nodeName:}" failed. No retries permitted until 2025-09-29 11:00:45.282105157 +0000 UTC m=+986.071246824 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5471e4e9-e190-4c84-a23b-090158ae7133-cert") pod "openstack-operator-controller-manager-74b6d8f6ff-gsjz5" (UID: "5471e4e9-e190-4c84-a23b-090158ae7133") : secret "webhook-server-cert" not found Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.793768 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-774b97b48-zgk9d" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.824880 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tm8q\" (UniqueName: \"kubernetes.io/projected/5471e4e9-e190-4c84-a23b-090158ae7133-kube-api-access-8tm8q\") pod \"openstack-operator-controller-manager-74b6d8f6ff-gsjz5\" (UID: \"5471e4e9-e190-4c84-a23b-090158ae7133\") " pod="openstack-operators/openstack-operator-controller-manager-74b6d8f6ff-gsjz5" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.838339 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-g82n6" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.838856 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-l55rr"] Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.869593 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-rwzzh" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.889741 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-gkcw9" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.905134 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpqx9\" (UniqueName: \"kubernetes.io/projected/5b974650-8f7f-4826-b767-0dcd35bb6f3f-kube-api-access-rpqx9\") pod \"rabbitmq-cluster-operator-manager-79d8469568-l55rr\" (UID: \"5b974650-8f7f-4826-b767-0dcd35bb6f3f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-l55rr" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.916006 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.970120 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpqx9\" (UniqueName: \"kubernetes.io/projected/5b974650-8f7f-4826-b767-0dcd35bb6f3f-kube-api-access-rpqx9\") pod \"rabbitmq-cluster-operator-manager-79d8469568-l55rr\" (UID: \"5b974650-8f7f-4826-b767-0dcd35bb6f3f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-l55rr" Sep 29 11:00:44 crc kubenswrapper[4752]: I0929 11:00:44.977952 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-tg8bp" Sep 29 11:00:45 crc kubenswrapper[4752]: I0929 11:00:45.109714 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-474s5\" (UID: \"6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-474s5" Sep 29 11:00:45 crc kubenswrapper[4752]: E0929 11:00:45.110411 4752 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 29 11:00:45 crc kubenswrapper[4752]: E0929 11:00:45.110502 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a-cert podName:6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a nodeName:}" failed. No retries permitted until 2025-09-29 11:00:46.110477579 +0000 UTC m=+986.899619286 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a-cert") pod "openstack-baremetal-operator-controller-manager-6d776955-474s5" (UID: "6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 29 11:00:45 crc kubenswrapper[4752]: I0929 11:00:45.117773 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-748c574d75-s6kn6"] Sep 29 11:00:45 crc kubenswrapper[4752]: I0929 11:00:45.308476 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 11:00:45 crc kubenswrapper[4752]: I0929 11:00:45.330539 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5471e4e9-e190-4c84-a23b-090158ae7133-cert\") pod \"openstack-operator-controller-manager-74b6d8f6ff-gsjz5\" (UID: \"5471e4e9-e190-4c84-a23b-090158ae7133\") " pod="openstack-operators/openstack-operator-controller-manager-74b6d8f6ff-gsjz5" Sep 29 11:00:45 crc kubenswrapper[4752]: E0929 11:00:45.330789 4752 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 29 11:00:45 crc kubenswrapper[4752]: E0929 11:00:45.330884 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5471e4e9-e190-4c84-a23b-090158ae7133-cert podName:5471e4e9-e190-4c84-a23b-090158ae7133 nodeName:}" failed. No retries permitted until 2025-09-29 11:00:46.330867938 +0000 UTC m=+987.120009605 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5471e4e9-e190-4c84-a23b-090158ae7133-cert") pod "openstack-operator-controller-manager-74b6d8f6ff-gsjz5" (UID: "5471e4e9-e190-4c84-a23b-090158ae7133") : secret "webhook-server-cert" not found Sep 29 11:00:45 crc kubenswrapper[4752]: I0929 11:00:45.408066 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-48hc9" Sep 29 11:00:45 crc kubenswrapper[4752]: I0929 11:00:45.482008 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-l55rr" Sep 29 11:00:45 crc kubenswrapper[4752]: I0929 11:00:45.672076 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6495d75b5-vdwdn"] Sep 29 11:00:45 crc kubenswrapper[4752]: I0929 11:00:45.692633 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-67b5d44b7f-fzv5x"] Sep 29 11:00:45 crc kubenswrapper[4752]: W0929 11:00:45.727778 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02766095_b09a_47c5_b0c0_8577cf0c4df0.slice/crio-82f4fa1fec37ed8f25c54ee58eb9aab1aed4cd8916ff6345024a899b57101250 WatchSource:0}: Error finding container 82f4fa1fec37ed8f25c54ee58eb9aab1aed4cd8916ff6345024a899b57101250: Status 404 returned error can't find the container with id 82f4fa1fec37ed8f25c54ee58eb9aab1aed4cd8916ff6345024a899b57101250 Sep 29 11:00:45 crc kubenswrapper[4752]: I0929 11:00:45.863888 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-56cf9c6b99-vxd7j"] Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.004454 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-vdwdn" event={"ID":"bcc941f9-bb40-42c6-a04a-2dbccdb5c63d","Type":"ContainerStarted","Data":"00bf1f130c88b5a82b603f16ed9c0094bbe654c1a74d85a7d14de950ec1fa381"} Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.006223 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-s6kn6" event={"ID":"22f0a092-282c-4339-b57e-29bba94f1c26","Type":"ContainerStarted","Data":"ca5b888f9fd6687be7d386b7bc6467eac7922c8704ee6ffa3bf7c1f568f852a7"} Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.007748 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-fzv5x" event={"ID":"02766095-b09a-47c5-b0c0-8577cf0c4df0","Type":"ContainerStarted","Data":"82f4fa1fec37ed8f25c54ee58eb9aab1aed4cd8916ff6345024a899b57101250"} Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.009183 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-vxd7j" event={"ID":"c95eca0a-b789-4db4-a906-d3323f1ee7ed","Type":"ContainerStarted","Data":"a5ef1f84354e7baefd8a1a62a902e70e75ef55e6b3085b39caec99c0d15a559b"} Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.045941 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-8ff95898-m96g4"] Sep 29 11:00:46 crc kubenswrapper[4752]: W0929 11:00:46.053733 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ba7065e_6eff_42bb_acc3_2595f5cc8e71.slice/crio-3e93cc8a226acf6afc8d3e6c8a8621826dbecaec65e08022d0e7fbcac0587d8c WatchSource:0}: Error finding container 3e93cc8a226acf6afc8d3e6c8a8621826dbecaec65e08022d0e7fbcac0587d8c: Status 404 returned error can't find the container with id 
3e93cc8a226acf6afc8d3e6c8a8621826dbecaec65e08022d0e7fbcac0587d8c Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.056469 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7bf498966c-vwbvq"] Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.153254 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-474s5\" (UID: \"6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-474s5" Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.158723 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-474s5\" (UID: \"6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-474s5" Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.344819 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-687b9cf756-qnmsk"] Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.357859 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5471e4e9-e190-4c84-a23b-090158ae7133-cert\") pod \"openstack-operator-controller-manager-74b6d8f6ff-gsjz5\" (UID: \"5471e4e9-e190-4c84-a23b-090158ae7133\") " pod="openstack-operators/openstack-operator-controller-manager-74b6d8f6ff-gsjz5" Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.361478 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-858cd69f49-c2twd"] Sep 29 11:00:46 crc 
kubenswrapper[4752]: I0929 11:00:46.361914 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5471e4e9-e190-4c84-a23b-090158ae7133-cert\") pod \"openstack-operator-controller-manager-74b6d8f6ff-gsjz5\" (UID: \"5471e4e9-e190-4c84-a23b-090158ae7133\") " pod="openstack-operators/openstack-operator-controller-manager-74b6d8f6ff-gsjz5" Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.381163 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-5nh8j"] Sep 29 11:00:46 crc kubenswrapper[4752]: W0929 11:00:46.390583 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb7baf7a_c952_4e2d_adaa_87815a1ad895.slice/crio-39e53cc6a2e0fe1b203b2c0637b9bccba93587eb6a0771d9253e847a79bb6dbd WatchSource:0}: Error finding container 39e53cc6a2e0fe1b203b2c0637b9bccba93587eb6a0771d9253e847a79bb6dbd: Status 404 returned error can't find the container with id 39e53cc6a2e0fe1b203b2c0637b9bccba93587eb6a0771d9253e847a79bb6dbd Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.395753 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d74f4d695-gnrn4"] Sep 29 11:00:46 crc kubenswrapper[4752]: W0929 11:00:46.407662 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f6d7f0e_296b_493f_8e5b_82dc348a3e6d.slice/crio-7391138a58ce1d8c4e468ae0f9296d48c410c4504d10f32982fc14f6dc2554ba WatchSource:0}: Error finding container 7391138a58ce1d8c4e468ae0f9296d48c410c4504d10f32982fc14f6dc2554ba: Status 404 returned error can't find the container with id 7391138a58ce1d8c4e468ae0f9296d48c410c4504d10f32982fc14f6dc2554ba Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.413042 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/horizon-operator-controller-manager-695847bc78-6ffsd"] Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.431331 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54d766c9f9-w5nvh"] Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.440249 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9fc8d5567-lncbz"] Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.440640 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-474s5" Sep 29 11:00:46 crc kubenswrapper[4752]: W0929 11:00:46.450048 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aa7ba00_f408_407e_83cc_ce6b2c5b51fa.slice/crio-fbc6893ad4a76341a9f2d307acba77d85fd80f866ae2be3d976482a8e905db30 WatchSource:0}: Error finding container fbc6893ad4a76341a9f2d307acba77d85fd80f866ae2be3d976482a8e905db30: Status 404 returned error can't find the container with id fbc6893ad4a76341a9f2d307acba77d85fd80f866ae2be3d976482a8e905db30 Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.451453 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5f95c46c78-tg8bp"] Sep 29 11:00:46 crc kubenswrapper[4752]: W0929 11:00:46.451963 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54b928b4_cc8d_4093_8aef_4d6540a226c3.slice/crio-aca8dcbbd20d7abe3d3835dcda4aaef53daae318fdc999f6071866a776b8ca3e WatchSource:0}: Error finding container aca8dcbbd20d7abe3d3835dcda4aaef53daae318fdc999f6071866a776b8ca3e: Status 404 returned error can't find the container with id aca8dcbbd20d7abe3d3835dcda4aaef53daae318fdc999f6071866a776b8ca3e Sep 29 11:00:46 crc 
kubenswrapper[4752]: E0929 11:00:46.454199 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.66:5001/openstack-k8s-operators/watcher-operator:76b6708e5bfd574ec2485f081768f993dcf4ea88,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w2c82,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-586b879c47-crvdt_openstack-operators(2aa7ba00-f408-407e-83cc-ce6b2c5b51fa): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.456324 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-rwzzh"] Sep 29 11:00:46 crc kubenswrapper[4752]: E0929 11:00:46.457531 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:26db59a990341558d29c00da7503b2c5b9a415db8cc04a0006f198f30ec016d4,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s2pwb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-5f95c46c78-tg8bp_openstack-operators(54b928b4-cc8d-4093-8aef-4d6540a226c3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.461682 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-48hc9"] Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.468257 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt"] Sep 29 11:00:46 crc kubenswrapper[4752]: W0929 11:00:46.468471 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac414d81_10d8_4ef8_aeed_3f8bf43eae1d.slice/crio-13e26da39a791abb7426a06b5568e04ad610cca58f0b498d8ef3001734d32db8 WatchSource:0}: Error finding container 13e26da39a791abb7426a06b5568e04ad610cca58f0b498d8ef3001734d32db8: Status 404 returned error can't find the container with id 13e26da39a791abb7426a06b5568e04ad610cca58f0b498d8ef3001734d32db8 Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.473265 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-g82n6"] Sep 29 11:00:46 crc kubenswrapper[4752]: E0929 11:00:46.490366 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:ae6fda8cafd6c3ab5d5e9c599d15b02ace61b8eacbac4de3df50427dfab6a0c0,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-59m5c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5bf96cfbc4-g82n6_openstack-operators(ac414d81-10d8-4ef8-aeed-3f8bf43eae1d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 11:00:46 crc kubenswrapper[4752]: E0929 11:00:46.490617 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-btdj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
swift-operator-controller-manager-bc7dc7bd9-48hc9_openstack-operators(f6d91c53-a4c9-4516-8666-75dac446a27e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.618015 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-gkcw9"] Sep 29 11:00:46 crc kubenswrapper[4752]: W0929 11:00:46.636762 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc35176b6_d50b_4e71_9c73_063a8213988c.slice/crio-a0b2a5812ebb20dbfad62876b9703373dd3f7ce142b0efe7ac5e2de37caa6071 WatchSource:0}: Error finding container a0b2a5812ebb20dbfad62876b9703373dd3f7ce142b0efe7ac5e2de37caa6071: Status 404 returned error can't find the container with id a0b2a5812ebb20dbfad62876b9703373dd3f7ce142b0efe7ac5e2de37caa6071 Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.644140 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-74b6d8f6ff-gsjz5" Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.686208 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-774b97b48-zgk9d"] Sep 29 11:00:46 crc kubenswrapper[4752]: I0929 11:00:46.696647 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-l55rr"] Sep 29 11:00:46 crc kubenswrapper[4752]: W0929 11:00:46.710927 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06abe84c_c450_4214_b94c_dc8ac39422bd.slice/crio-44232fe7007edcc970a716ac5be73cd7108591aa076764a4f0f38bc831d851d9 WatchSource:0}: Error finding container 44232fe7007edcc970a716ac5be73cd7108591aa076764a4f0f38bc831d851d9: Status 404 returned error can't find the container with id 44232fe7007edcc970a716ac5be73cd7108591aa076764a4f0f38bc831d851d9 Sep 29 11:00:46 crc kubenswrapper[4752]: E0929 11:00:46.739459 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt" podUID="2aa7ba00-f408-407e-83cc-ce6b2c5b51fa" Sep 29 11:00:46 crc kubenswrapper[4752]: E0929 11:00:46.758488 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rpqx9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-79d8469568-l55rr_openstack-operators(5b974650-8f7f-4826-b767-0dcd35bb6f3f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 
11:00:46 crc kubenswrapper[4752]: E0929 11:00:46.759987 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-l55rr" podUID="5b974650-8f7f-4826-b767-0dcd35bb6f3f" Sep 29 11:00:46 crc kubenswrapper[4752]: E0929 11:00:46.787847 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-48hc9" podUID="f6d91c53-a4c9-4516-8666-75dac446a27e" Sep 29 11:00:46 crc kubenswrapper[4752]: E0929 11:00:46.807776 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-g82n6" podUID="ac414d81-10d8-4ef8-aeed-3f8bf43eae1d" Sep 29 11:00:46 crc kubenswrapper[4752]: E0929 11:00:46.813363 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-tg8bp" podUID="54b928b4-cc8d-4093-8aef-4d6540a226c3" Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.036521 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-474s5"] Sep 29 11:00:47 crc kubenswrapper[4752]: W0929 11:00:47.061379 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c7e19dc_afaf_4b5e_96b0_2049f29b5d0a.slice/crio-57a646732c859db406bb29e0608935c36f1189684cea0285a0e79f21af46e083 WatchSource:0}: Error finding container 57a646732c859db406bb29e0608935c36f1189684cea0285a0e79f21af46e083: Status 404 returned error can't find the 
container with id 57a646732c859db406bb29e0608935c36f1189684cea0285a0e79f21af46e083 Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.066501 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-774b97b48-zgk9d" event={"ID":"06abe84c-c450-4214-b94c-dc8ac39422bd","Type":"ContainerStarted","Data":"44232fe7007edcc970a716ac5be73cd7108591aa076764a4f0f38bc831d851d9"} Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.076211 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-g82n6" event={"ID":"ac414d81-10d8-4ef8-aeed-3f8bf43eae1d","Type":"ContainerStarted","Data":"99c596b3d83e368426d32712c41468e09947cb24d07364333f61a6cfa53063c9"} Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.076277 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-g82n6" event={"ID":"ac414d81-10d8-4ef8-aeed-3f8bf43eae1d","Type":"ContainerStarted","Data":"13e26da39a791abb7426a06b5568e04ad610cca58f0b498d8ef3001734d32db8"} Sep 29 11:00:47 crc kubenswrapper[4752]: E0929 11:00:47.079396 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:ae6fda8cafd6c3ab5d5e9c599d15b02ace61b8eacbac4de3df50427dfab6a0c0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-g82n6" podUID="ac414d81-10d8-4ef8-aeed-3f8bf43eae1d" Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.082459 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-5nh8j" event={"ID":"db7baf7a-c952-4e2d-adaa-87815a1ad895","Type":"ContainerStarted","Data":"39e53cc6a2e0fe1b203b2c0637b9bccba93587eb6a0771d9253e847a79bb6dbd"} Sep 29 11:00:47 crc kubenswrapper[4752]: 
I0929 11:00:47.085671 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-rwzzh" event={"ID":"ec44f1a0-8747-41d6-bf36-9899957065fc","Type":"ContainerStarted","Data":"ab67df183482a537608fb30482301aabef017763e12561048e5bf7168eb57e56"} Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.097945 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-gnrn4" event={"ID":"d3d16719-6f3a-40f3-a68e-7ca209588644","Type":"ContainerStarted","Data":"5ede8e61c9cb7f5412f15bb67b6d27dd06297d3b4f8057afc3fbcbe6ff8cf719"} Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.100310 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt" event={"ID":"2aa7ba00-f408-407e-83cc-ce6b2c5b51fa","Type":"ContainerStarted","Data":"705735b9f9a1f68d329388ae806fd4f3f45a0de760fd3a0ae8d4129caf84011e"} Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.100359 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt" event={"ID":"2aa7ba00-f408-407e-83cc-ce6b2c5b51fa","Type":"ContainerStarted","Data":"fbc6893ad4a76341a9f2d307acba77d85fd80f866ae2be3d976482a8e905db30"} Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.104845 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-74b6d8f6ff-gsjz5"] Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.104904 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-c2twd" event={"ID":"e06f89cc-db11-4692-ab2c-50405feb9ca1","Type":"ContainerStarted","Data":"f2f3d81791e752fcb9b827b0cb15dc25e1e3a04be4a57269d76149b2f508f29e"} Sep 29 11:00:47 crc kubenswrapper[4752]: E0929 11:00:47.106728 4752 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.66:5001/openstack-k8s-operators/watcher-operator:76b6708e5bfd574ec2485f081768f993dcf4ea88\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt" podUID="2aa7ba00-f408-407e-83cc-ce6b2c5b51fa"
Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.107106 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8ff95898-m96g4" event={"ID":"ce29c55b-caec-4b27-b9fd-e815c897c38e","Type":"ContainerStarted","Data":"7e41ec32b6097a62142ae1623b38d5d81ac3f9237145912760fc594004c94180"}
Sep 29 11:00:47 crc kubenswrapper[4752]: W0929 11:00:47.114838 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5471e4e9_e190_4c84_a23b_090158ae7133.slice/crio-a71499e54b9ebd274e422c2da7aef8abfa6efaaf6cb8cb938ccc0fea2c953681 WatchSource:0}: Error finding container a71499e54b9ebd274e422c2da7aef8abfa6efaaf6cb8cb938ccc0fea2c953681: Status 404 returned error can't find the container with id a71499e54b9ebd274e422c2da7aef8abfa6efaaf6cb8cb938ccc0fea2c953681
Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.126716 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-48hc9" event={"ID":"f6d91c53-a4c9-4516-8666-75dac446a27e","Type":"ContainerStarted","Data":"4c9a54736c0b5cdeecab2a8e37f2595b320a362caece78e18c99cad31f970ccf"}
Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.126754 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-48hc9" event={"ID":"f6d91c53-a4c9-4516-8666-75dac446a27e","Type":"ContainerStarted","Data":"e685a185e06a714b2a5bb75071c6a7251edf99c0e83e88eadb377e53895554aa"}
Sep 29 11:00:47 crc kubenswrapper[4752]: E0929 11:00:47.128639 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-48hc9" podUID="f6d91c53-a4c9-4516-8666-75dac446a27e"
Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.130889 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-gkcw9" event={"ID":"c35176b6-d50b-4e71-9c73-063a8213988c","Type":"ContainerStarted","Data":"a0b2a5812ebb20dbfad62876b9703373dd3f7ce142b0efe7ac5e2de37caa6071"}
Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.143247 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-w5nvh" event={"ID":"97664b33-b187-472d-8e08-462174d3e49a","Type":"ContainerStarted","Data":"19800f3e930008fbb7626473e3a2f9c00509048021ce3cca9c00db0529b4ef10"}
Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.145612 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-qnmsk" event={"ID":"28e91257-715f-462e-be70-361652522cb3","Type":"ContainerStarted","Data":"35cfc4d764ee2ca707733d68354b0c9a6e4a30188cfc5c17d7a45e783af04322"}
Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.146709 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-vwbvq" event={"ID":"4ba7065e-6eff-42bb-acc3-2595f5cc8e71","Type":"ContainerStarted","Data":"3e93cc8a226acf6afc8d3e6c8a8621826dbecaec65e08022d0e7fbcac0587d8c"}
Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.147642 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-6ffsd" event={"ID":"4c1a6d67-1063-44bf-a2fa-be8dde72fabf","Type":"ContainerStarted","Data":"14b68f506690c6583e745b0c307d308eae88b19d6abeae4272f008ad36db1089"}
Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.149099 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-l55rr" event={"ID":"5b974650-8f7f-4826-b767-0dcd35bb6f3f","Type":"ContainerStarted","Data":"90c1727116086bac530d04680c6c64e3a8e5d6bb47ceefb80e58e61e73324eb9"}
Sep 29 11:00:47 crc kubenswrapper[4752]: E0929 11:00:47.155028 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-l55rr" podUID="5b974650-8f7f-4826-b767-0dcd35bb6f3f"
Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.156632 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-tg8bp" event={"ID":"54b928b4-cc8d-4093-8aef-4d6540a226c3","Type":"ContainerStarted","Data":"79143059f6f031b80471b8399800ff454a6a5a620147a40e844667e3e841207a"}
Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.156673 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-tg8bp" event={"ID":"54b928b4-cc8d-4093-8aef-4d6540a226c3","Type":"ContainerStarted","Data":"aca8dcbbd20d7abe3d3835dcda4aaef53daae318fdc999f6071866a776b8ca3e"}
Sep 29 11:00:47 crc kubenswrapper[4752]: E0929 11:00:47.158608 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:26db59a990341558d29c00da7503b2c5b9a415db8cc04a0006f198f30ec016d4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-tg8bp" podUID="54b928b4-cc8d-4093-8aef-4d6540a226c3"
Sep 29 11:00:47 crc kubenswrapper[4752]: I0929 11:00:47.160430 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-lncbz" event={"ID":"6f6d7f0e-296b-493f-8e5b-82dc348a3e6d","Type":"ContainerStarted","Data":"7391138a58ce1d8c4e468ae0f9296d48c410c4504d10f32982fc14f6dc2554ba"}
Sep 29 11:00:48 crc kubenswrapper[4752]: I0929 11:00:48.183097 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-74b6d8f6ff-gsjz5" event={"ID":"5471e4e9-e190-4c84-a23b-090158ae7133","Type":"ContainerStarted","Data":"3e3ca79f553eacaf7ef4a4f95a3b2460a78611ffb3e6ef4f06b251297f17ab9c"}
Sep 29 11:00:48 crc kubenswrapper[4752]: I0929 11:00:48.183503 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-74b6d8f6ff-gsjz5" event={"ID":"5471e4e9-e190-4c84-a23b-090158ae7133","Type":"ContainerStarted","Data":"6b207143f189a394b7186ee90f2ba0c1fb799613b417eba682d753018907bb6e"}
Sep 29 11:00:48 crc kubenswrapper[4752]: I0929 11:00:48.183516 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-74b6d8f6ff-gsjz5" event={"ID":"5471e4e9-e190-4c84-a23b-090158ae7133","Type":"ContainerStarted","Data":"a71499e54b9ebd274e422c2da7aef8abfa6efaaf6cb8cb938ccc0fea2c953681"}
Sep 29 11:00:48 crc kubenswrapper[4752]: I0929 11:00:48.184469 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-74b6d8f6ff-gsjz5"
Sep 29 11:00:48 crc kubenswrapper[4752]: I0929 11:00:48.193631 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-474s5" event={"ID":"6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a","Type":"ContainerStarted","Data":"57a646732c859db406bb29e0608935c36f1189684cea0285a0e79f21af46e083"}
Sep 29 11:00:48 crc kubenswrapper[4752]: E0929 11:00:48.197140 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-l55rr" podUID="5b974650-8f7f-4826-b767-0dcd35bb6f3f"
Sep 29 11:00:48 crc kubenswrapper[4752]: E0929 11:00:48.197322 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:26db59a990341558d29c00da7503b2c5b9a415db8cc04a0006f198f30ec016d4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-tg8bp" podUID="54b928b4-cc8d-4093-8aef-4d6540a226c3"
Sep 29 11:00:48 crc kubenswrapper[4752]: E0929 11:00:48.201338 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-48hc9" podUID="f6d91c53-a4c9-4516-8666-75dac446a27e"
Sep 29 11:00:48 crc kubenswrapper[4752]: E0929 11:00:48.202473 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.66:5001/openstack-k8s-operators/watcher-operator:76b6708e5bfd574ec2485f081768f993dcf4ea88\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt" podUID="2aa7ba00-f408-407e-83cc-ce6b2c5b51fa"
Sep 29 11:00:48 crc kubenswrapper[4752]: E0929 11:00:48.203915 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:ae6fda8cafd6c3ab5d5e9c599d15b02ace61b8eacbac4de3df50427dfab6a0c0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-g82n6" podUID="ac414d81-10d8-4ef8-aeed-3f8bf43eae1d"
Sep 29 11:00:48 crc kubenswrapper[4752]: I0929 11:00:48.223241 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-74b6d8f6ff-gsjz5" podStartSLOduration=4.223222534 podStartE2EDuration="4.223222534s" podCreationTimestamp="2025-09-29 11:00:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:00:48.218236975 +0000 UTC m=+989.007378642" watchObservedRunningTime="2025-09-29 11:00:48.223222534 +0000 UTC m=+989.012364201"
Sep 29 11:00:56 crc kubenswrapper[4752]: I0929 11:00:56.651105 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-74b6d8f6ff-gsjz5"
Sep 29 11:00:58 crc kubenswrapper[4752]: I0929 11:00:58.326422 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-s6kn6" event={"ID":"22f0a092-282c-4339-b57e-29bba94f1c26","Type":"ContainerStarted","Data":"f41884764d8ba90def4b1608576d2fea2d4003fed0d1f6e8fa188cc07ded1ffb"}
Sep 29 11:00:58 crc kubenswrapper[4752]: I0929 11:00:58.338101 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-474s5" event={"ID":"6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a","Type":"ContainerStarted","Data":"53e9d1175a342a4407bc586c6a1ad97b6069edf714e7a7305da87b15835dea5d"}
Sep 29 11:00:58 crc kubenswrapper[4752]: I0929 11:00:58.355452 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-vxd7j" event={"ID":"c95eca0a-b789-4db4-a906-d3323f1ee7ed","Type":"ContainerStarted","Data":"e89e96c98b4801d10c99b4ace0eb9fc6ffc27d336eb1baa8228f7796674fd14b"}
Sep 29 11:00:58 crc kubenswrapper[4752]: I0929 11:00:58.360704 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8ff95898-m96g4" event={"ID":"ce29c55b-caec-4b27-b9fd-e815c897c38e","Type":"ContainerStarted","Data":"ca6da17769df6841cf6de52471e7885be682aaed09fab791b48915b466faf7cf"}
Sep 29 11:00:58 crc kubenswrapper[4752]: I0929 11:00:58.361863 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-gkcw9" event={"ID":"c35176b6-d50b-4e71-9c73-063a8213988c","Type":"ContainerStarted","Data":"96803dba5f5ff1c2044f17daa52d763a87a3ef190e6952afb754ed0a16b56858"}
Sep 29 11:00:58 crc kubenswrapper[4752]: I0929 11:00:58.379733 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-rwzzh" event={"ID":"ec44f1a0-8747-41d6-bf36-9899957065fc","Type":"ContainerStarted","Data":"18a3f950db59b7ad22d8a5d7022e6cac4c6c57f7afa4cc8c9b5ee0f152d2fc53"}
Sep 29 11:00:58 crc kubenswrapper[4752]: I0929 11:00:58.405442 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-w5nvh" event={"ID":"97664b33-b187-472d-8e08-462174d3e49a","Type":"ContainerStarted","Data":"93dd26d7d87679aa3b755e51a849f4acbfaa297058ba094b174860af1075aef1"}
Sep 29 11:00:58 crc kubenswrapper[4752]: I0929 11:00:58.418974 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-vwbvq" event={"ID":"4ba7065e-6eff-42bb-acc3-2595f5cc8e71","Type":"ContainerStarted","Data":"0b3ab13d2f882db61892d7694626ba2f8e4f1b8def1f6fc47945c8f25d010f2e"}
Sep 29 11:00:58 crc kubenswrapper[4752]: I0929 11:00:58.430156 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-6ffsd" event={"ID":"4c1a6d67-1063-44bf-a2fa-be8dde72fabf","Type":"ContainerStarted","Data":"a5d728cf297602146edb1a3499bfbab258e921ded792ee7602a4085ea5f19bf4"}
Sep 29 11:00:58 crc kubenswrapper[4752]: I0929 11:00:58.447501 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-774b97b48-zgk9d" event={"ID":"06abe84c-c450-4214-b94c-dc8ac39422bd","Type":"ContainerStarted","Data":"3700303ce13edd1d749705d803944b64f0116d8ec80566d4ff22c63fd1ed45f6"}
Sep 29 11:00:58 crc kubenswrapper[4752]: I0929 11:00:58.457428 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-fzv5x" event={"ID":"02766095-b09a-47c5-b0c0-8577cf0c4df0","Type":"ContainerStarted","Data":"888d9b0f2bbaaf4e67c7096358a5f13ab887d460813618a46f191191542f1360"}
Sep 29 11:00:58 crc kubenswrapper[4752]: I0929 11:00:58.458715 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-fzv5x"
Sep 29 11:00:58 crc kubenswrapper[4752]: I0929 11:00:58.472743 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-vdwdn" event={"ID":"bcc941f9-bb40-42c6-a04a-2dbccdb5c63d","Type":"ContainerStarted","Data":"2a5a34a2cfa5a1bf6ad11db09dcd9ab0ad5faecd85e44d38a078f135670aff42"}
Sep 29 11:00:58 crc kubenswrapper[4752]: I0929 11:00:58.506019 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-fzv5x" podStartSLOduration=3.861917851 podStartE2EDuration="15.505993514s" podCreationTimestamp="2025-09-29 11:00:43 +0000 UTC" firstStartedPulling="2025-09-29 11:00:45.733290455 +0000 UTC m=+986.522432122" lastFinishedPulling="2025-09-29 11:00:57.377366118 +0000 UTC m=+998.166507785" observedRunningTime="2025-09-29 11:00:58.498837349 +0000 UTC m=+999.287979016" watchObservedRunningTime="2025-09-29 11:00:58.505993514 +0000 UTC m=+999.295135201"
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.542471 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-vdwdn" event={"ID":"bcc941f9-bb40-42c6-a04a-2dbccdb5c63d","Type":"ContainerStarted","Data":"25fa000e365a5508e3581cd38d39d6437ae6d8074362a1f047cd477275ec94ee"}
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.555695 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-rwzzh" event={"ID":"ec44f1a0-8747-41d6-bf36-9899957065fc","Type":"ContainerStarted","Data":"395c9426a970129f6371fe0dea0075c20541ffbf66fa274f293931c9cad11c19"}
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.556495 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-f66b554c6-rwzzh"
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.559346 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-s6kn6" event={"ID":"22f0a092-282c-4339-b57e-29bba94f1c26","Type":"ContainerStarted","Data":"244f291fbf1e8b86c29a8329713cfe6ef83f24f95f90f357e33ef9f93b78c577"}
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.559975 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-s6kn6"
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.562349 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-w5nvh" event={"ID":"97664b33-b187-472d-8e08-462174d3e49a","Type":"ContainerStarted","Data":"e11fbca06562d847512efb3332068745780c9e7ecd4201bf9df0059d9631c1a8"}
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.562473 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-w5nvh"
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.568149 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-c2twd" event={"ID":"e06f89cc-db11-4692-ab2c-50405feb9ca1","Type":"ContainerStarted","Data":"2d22d91973aae4f886ef8cd384acaec1cc7214c5959c8b2b9f5cb25d63d205b7"}
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.571025 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-vxd7j" event={"ID":"c95eca0a-b789-4db4-a906-d3323f1ee7ed","Type":"ContainerStarted","Data":"244c1618e1f9c10e71e0e57c8feabe1fae9212e542f93b4660feb1677046bf55"}
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.574843 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-6ffsd" event={"ID":"4c1a6d67-1063-44bf-a2fa-be8dde72fabf","Type":"ContainerStarted","Data":"7055723d1e1f4d221b8378dac21c42d8ccdd2d1a21c1d54863b21d63b39bd168"}
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.587383 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-f66b554c6-rwzzh" podStartSLOduration=4.6429245980000005 podStartE2EDuration="15.587362429s" podCreationTimestamp="2025-09-29 11:00:44 +0000 UTC" firstStartedPulling="2025-09-29 11:00:46.449773713 +0000 UTC m=+987.238915380" lastFinishedPulling="2025-09-29 11:00:57.394211544 +0000 UTC m=+998.183353211" observedRunningTime="2025-09-29 11:00:59.587350668 +0000 UTC m=+1000.376492325" watchObservedRunningTime="2025-09-29 11:00:59.587362429 +0000 UTC m=+1000.376504096"
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.595081 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8ff95898-m96g4" event={"ID":"ce29c55b-caec-4b27-b9fd-e815c897c38e","Type":"ContainerStarted","Data":"2f0ecc6374c4b311d9177babb09d7047a94379c9da200b44f4ee5feb17cad1cd"}
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.595244 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-8ff95898-m96g4"
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.606550 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-qnmsk" event={"ID":"28e91257-715f-462e-be70-361652522cb3","Type":"ContainerStarted","Data":"4c520c2e9e77a7b008a423381b668f650241a72b6214b9c1c11b6bf6a2877fd7"}
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.624176 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-w5nvh" podStartSLOduration=5.6532736660000005 podStartE2EDuration="16.62415709s" podCreationTimestamp="2025-09-29 11:00:43 +0000 UTC" firstStartedPulling="2025-09-29 11:00:46.406853943 +0000 UTC m=+987.195995610" lastFinishedPulling="2025-09-29 11:00:57.377737377 +0000 UTC m=+998.166879034" observedRunningTime="2025-09-29 11:00:59.616372778 +0000 UTC m=+1000.405514445" watchObservedRunningTime="2025-09-29 11:00:59.62415709 +0000 UTC m=+1000.413298757"
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.633714 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-5nh8j" event={"ID":"db7baf7a-c952-4e2d-adaa-87815a1ad895","Type":"ContainerStarted","Data":"c4760ccabc22898157071ffb103ebe950109a2a21ce66b6bde0a0066a952821e"}
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.635741 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-lncbz" event={"ID":"6f6d7f0e-296b-493f-8e5b-82dc348a3e6d","Type":"ContainerStarted","Data":"ee651e00634825e7358076921786facfb7d73c4cd3eeb899de06e26b4cee94fb"}
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.641105 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-474s5" event={"ID":"6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a","Type":"ContainerStarted","Data":"145249e8b6dabf8cc85697397975d57a05e9bd562a1551adb07aece35ae2d112"}
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.642127 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-474s5"
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.651001 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-fzv5x" event={"ID":"02766095-b09a-47c5-b0c0-8577cf0c4df0","Type":"ContainerStarted","Data":"ee179eab6dc237e11f9841632e69b443bca32b1a09e6f1f55c09621408f10d2d"}
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.653792 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-s6kn6" podStartSLOduration=4.607342147 podStartE2EDuration="16.653765515s" podCreationTimestamp="2025-09-29 11:00:43 +0000 UTC" firstStartedPulling="2025-09-29 11:00:45.308164251 +0000 UTC m=+986.097305918" lastFinishedPulling="2025-09-29 11:00:57.354587619 +0000 UTC m=+998.143729286" observedRunningTime="2025-09-29 11:00:59.653070748 +0000 UTC m=+1000.442212425" watchObservedRunningTime="2025-09-29 11:00:59.653765515 +0000 UTC m=+1000.442907182"
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.656486 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-gnrn4" event={"ID":"d3d16719-6f3a-40f3-a68e-7ca209588644","Type":"ContainerStarted","Data":"84c7090c045683bde2bcc2c92d3b6fb5419b05819015620ece5cd81e43452ab8"}
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.692951 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-8ff95898-m96g4" podStartSLOduration=5.378312284 podStartE2EDuration="16.692932118s" podCreationTimestamp="2025-09-29 11:00:43 +0000 UTC" firstStartedPulling="2025-09-29 11:00:46.054566202 +0000 UTC m=+986.843707869" lastFinishedPulling="2025-09-29 11:00:57.369186016 +0000 UTC m=+998.158327703" observedRunningTime="2025-09-29 11:00:59.686374648 +0000 UTC m=+1000.475516335" watchObservedRunningTime="2025-09-29 11:00:59.692932118 +0000 UTC m=+1000.482073785"
Sep 29 11:00:59 crc kubenswrapper[4752]: I0929 11:00:59.735233 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-474s5" podStartSLOduration=5.415279481 podStartE2EDuration="15.735216312s" podCreationTimestamp="2025-09-29 11:00:44 +0000 UTC" firstStartedPulling="2025-09-29 11:00:47.071702666 +0000 UTC m=+987.860844333" lastFinishedPulling="2025-09-29 11:00:57.391639497 +0000 UTC m=+998.180781164" observedRunningTime="2025-09-29 11:00:59.732547132 +0000 UTC m=+1000.521688799" watchObservedRunningTime="2025-09-29 11:00:59.735216312 +0000 UTC m=+1000.524357979"
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.683693 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-774b97b48-zgk9d" event={"ID":"06abe84c-c450-4214-b94c-dc8ac39422bd","Type":"ContainerStarted","Data":"b5c5374f77398e15f4abba3602e78e5edd30fd56b546a8766b594975a7a3eb6c"}
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.684081 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-774b97b48-zgk9d"
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.686365 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-5nh8j" event={"ID":"db7baf7a-c952-4e2d-adaa-87815a1ad895","Type":"ContainerStarted","Data":"d30b23417e9550716f3af46350da06b13c7316a52ecdb4ba4e2047f8f31d9624"}
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.686585 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-5nh8j"
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.688407 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-lncbz" event={"ID":"6f6d7f0e-296b-493f-8e5b-82dc348a3e6d","Type":"ContainerStarted","Data":"0714f7e1ec6291041bf10805cc5cc76d20bb7f384d66b160f4fa4a2ad7482d93"}
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.688554 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-lncbz"
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.690847 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-gkcw9" event={"ID":"c35176b6-d50b-4e71-9c73-063a8213988c","Type":"ContainerStarted","Data":"1a666fde4431fa5e8c53a328bfacfae363589c250c0cbd2091b4012ee0152f81"}
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.690942 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-gkcw9"
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.693716 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-gnrn4" event={"ID":"d3d16719-6f3a-40f3-a68e-7ca209588644","Type":"ContainerStarted","Data":"861ea513f678d130626ba92e93aae73699f8a397a6b4432093f46e602108f773"}
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.693775 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-gnrn4"
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.700708 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-qnmsk" event={"ID":"28e91257-715f-462e-be70-361652522cb3","Type":"ContainerStarted","Data":"a1af9ff003c2430be35c79ff532218af62ed59b11a539cb0f6a0c52f23993301"}
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.701122 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-qnmsk"
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.706553 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-774b97b48-zgk9d" podStartSLOduration=6.032349188 podStartE2EDuration="16.70653135s" podCreationTimestamp="2025-09-29 11:00:44 +0000 UTC" firstStartedPulling="2025-09-29 11:00:46.724071796 +0000 UTC m=+987.513213463" lastFinishedPulling="2025-09-29 11:00:57.398253958 +0000 UTC m=+998.187395625" observedRunningTime="2025-09-29 11:01:00.700391981 +0000 UTC m=+1001.489533668" watchObservedRunningTime="2025-09-29 11:01:00.70653135 +0000 UTC m=+1001.495673017"
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.709277 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-vwbvq" event={"ID":"4ba7065e-6eff-42bb-acc3-2595f5cc8e71","Type":"ContainerStarted","Data":"d5d3836423db26fc04eb3a2946e205bcc093445c665c4bea340575a185dd8156"}
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.709419 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-vwbvq"
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.713359 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-c2twd" event={"ID":"e06f89cc-db11-4692-ab2c-50405feb9ca1","Type":"ContainerStarted","Data":"53782f500dcf56d2950faf3995b22d19213ddb9cc2fd940c5067f7546deabacf"}
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.714542 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-vxd7j"
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.714585 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-vdwdn"
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.723354 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-gkcw9" podStartSLOduration=6.95472189 podStartE2EDuration="17.723326244s" podCreationTimestamp="2025-09-29 11:00:43 +0000 UTC" firstStartedPulling="2025-09-29 11:00:46.688527307 +0000 UTC m=+987.477668974" lastFinishedPulling="2025-09-29 11:00:57.457131661 +0000 UTC m=+998.246273328" observedRunningTime="2025-09-29 11:01:00.717247157 +0000 UTC m=+1001.506388824" watchObservedRunningTime="2025-09-29 11:01:00.723326244 +0000 UTC m=+1001.512467921"
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.741450 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-lncbz" podStartSLOduration=6.757938723 podStartE2EDuration="17.741361781s" podCreationTimestamp="2025-09-29 11:00:43 +0000 UTC" firstStartedPulling="2025-09-29 11:00:46.412167261 +0000 UTC m=+987.201308928" lastFinishedPulling="2025-09-29 11:00:57.395590319 +0000 UTC m=+998.184731986" observedRunningTime="2025-09-29 11:01:00.735635843 +0000 UTC m=+1001.524777510" watchObservedRunningTime="2025-09-29 11:01:00.741361781 +0000 UTC m=+1001.530503448"
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.760506 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-5nh8j" podStartSLOduration=6.768058922 podStartE2EDuration="17.760467584s" podCreationTimestamp="2025-09-29 11:00:43 +0000 UTC" firstStartedPulling="2025-09-29 11:00:46.40403872 +0000 UTC m=+987.193180387" lastFinishedPulling="2025-09-29 11:00:57.396447382 +0000 UTC m=+998.185589049" observedRunningTime="2025-09-29 11:01:00.758291508 +0000 UTC m=+1001.547433195" watchObservedRunningTime="2025-09-29 11:01:00.760467584 +0000 UTC m=+1001.549609251"
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.777026 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-gnrn4" podStartSLOduration=6.783464262 podStartE2EDuration="17.777008182s" podCreationTimestamp="2025-09-29 11:00:43 +0000 UTC" firstStartedPulling="2025-09-29 11:00:46.407923161 +0000 UTC m=+987.197064828" lastFinishedPulling="2025-09-29 11:00:57.401467081 +0000 UTC m=+998.190608748" observedRunningTime="2025-09-29 11:01:00.774003015 +0000 UTC m=+1001.563144692" watchObservedRunningTime="2025-09-29 11:01:00.777008182 +0000 UTC m=+1001.566149849"
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.794796 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-vxd7j" podStartSLOduration=6.297033612 podStartE2EDuration="17.794763972s" podCreationTimestamp="2025-09-29 11:00:43 +0000 UTC" firstStartedPulling="2025-09-29 11:00:45.897909661 +0000 UTC m=+986.687051328" lastFinishedPulling="2025-09-29 11:00:57.395640031 +0000 UTC m=+998.184781688" observedRunningTime="2025-09-29 11:01:00.791785135 +0000 UTC m=+1001.580926832" watchObservedRunningTime="2025-09-29 11:01:00.794763972 +0000 UTC m=+1001.583905639"
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.820714 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-vdwdn" podStartSLOduration=6.138918042 podStartE2EDuration="17.820688481s" podCreationTimestamp="2025-09-29 11:00:43 +0000 UTC" firstStartedPulling="2025-09-29 11:00:45.714593841 +0000 UTC m=+986.503735508" lastFinishedPulling="2025-09-29 11:00:57.39636428 +0000 UTC m=+998.185505947" observedRunningTime="2025-09-29 11:01:00.816312918 +0000 UTC m=+1001.605454585" watchObservedRunningTime="2025-09-29 11:01:00.820688481 +0000 UTC m=+1001.609830158"
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.843844 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-c2twd" podStartSLOduration=6.898225569 podStartE2EDuration="17.84379638s" podCreationTimestamp="2025-09-29 11:00:43 +0000 UTC" firstStartedPulling="2025-09-29 11:00:46.447922025 +0000 UTC m=+987.237063692" lastFinishedPulling="2025-09-29 11:00:57.393492836 +0000 UTC m=+998.182634503" observedRunningTime="2025-09-29 11:01:00.836512831 +0000 UTC m=+1001.625654498" watchObservedRunningTime="2025-09-29 11:01:00.84379638 +0000 UTC m=+1001.632938047"
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.863468 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-vwbvq" podStartSLOduration=6.552486039 podStartE2EDuration="17.863444238s" podCreationTimestamp="2025-09-29 11:00:43 +0000 UTC" firstStartedPulling="2025-09-29 11:00:46.058095874 +0000 UTC m=+986.847237541" lastFinishedPulling="2025-09-29 11:00:57.369054073 +0000 UTC m=+998.158195740" observedRunningTime="2025-09-29 11:01:00.861165488 +0000 UTC m=+1001.650307155" watchObservedRunningTime="2025-09-29 11:01:00.863444238 +0000 UTC m=+1001.652585905"
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.897994 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-6ffsd" podStartSLOduration=6.94776257 podStartE2EDuration="17.897974541s" podCreationTimestamp="2025-09-29 11:00:43 +0000 UTC" firstStartedPulling="2025-09-29 11:00:46.404083491 +0000 UTC m=+987.193225158" lastFinishedPulling="2025-09-29 11:00:57.354295452 +0000 UTC m=+998.143437129" observedRunningTime="2025-09-29 11:01:00.888360822 +0000 UTC m=+1001.677502499" watchObservedRunningTime="2025-09-29 11:01:00.897974541 +0000 UTC m=+1001.687116208"
Sep 29 11:01:00 crc kubenswrapper[4752]: I0929 11:01:00.914091 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-qnmsk" podStartSLOduration=6.898717031 podStartE2EDuration="17.914057056s" podCreationTimestamp="2025-09-29 11:00:43 +0000 UTC" firstStartedPulling="2025-09-29 11:00:46.379752262 +0000 UTC m=+987.168893929" lastFinishedPulling="2025-09-29 11:00:57.395092287 +0000 UTC m=+998.184233954" observedRunningTime="2025-09-29 11:01:00.907834766 +0000 UTC m=+1001.696976433" watchObservedRunningTime="2025-09-29 11:01:00.914057056 +0000 UTC m=+1001.703198723"
Sep 29 11:01:01 crc kubenswrapper[4752]: I0929 11:01:01.722408 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-c2twd"
Sep 29 11:01:03 crc kubenswrapper[4752]: I0929 11:01:03.763867 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-l55rr" event={"ID":"5b974650-8f7f-4826-b767-0dcd35bb6f3f","Type":"ContainerStarted","Data":"f8da68f2d739933b3f892ebe60c2320a23fcd7c9c4afaf45c2d6776b530936ae"}
Sep 29 11:01:03 crc kubenswrapper[4752]: I0929 11:01:03.787162 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-tg8bp" event={"ID":"54b928b4-cc8d-4093-8aef-4d6540a226c3","Type":"ContainerStarted","Data":"24b83eef781e01e4bf28b8e4bc6ac9a1e1d89513b827c1b6bc71a55b0eded001"}
Sep 29 11:01:03 crc kubenswrapper[4752]: I0929 11:01:03.787909 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-tg8bp"
Sep 29 11:01:03 crc kubenswrapper[4752]: I0929 11:01:03.791229 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt" event={"ID":"2aa7ba00-f408-407e-83cc-ce6b2c5b51fa","Type":"ContainerStarted","Data":"264687c86f5501ed88817cc13db302ab21529cc10bcfd4654e6f99fe20b5c690"}
Sep 29 11:01:03 crc kubenswrapper[4752]: I0929 11:01:03.791771 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt"
Sep 29 11:01:03 crc kubenswrapper[4752]: I0929 11:01:03.793167 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-l55rr" podStartSLOduration=3.3363846600000002 podStartE2EDuration="19.793158119s" podCreationTimestamp="2025-09-29 11:00:44 +0000 UTC" firstStartedPulling="2025-09-29 11:00:46.758357703 +0000 UTC m=+987.547499370" lastFinishedPulling="2025-09-29 11:01:03.215131162 +0000 UTC m=+1004.004272829" observedRunningTime="2025-09-29 11:01:03.788271693 +0000 UTC m=+1004.577413360" watchObservedRunningTime="2025-09-29 11:01:03.793158119 +0000 UTC m=+1004.582299786"
Sep 29 11:01:03 crc kubenswrapper[4752]: I0929 11:01:03.839625 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-tg8bp" podStartSLOduration=3.101427554 podStartE2EDuration="19.839595211s" podCreationTimestamp="2025-09-29 11:00:44 +0000 UTC" firstStartedPulling="2025-09-29 11:00:46.45742181 +0000 UTC m=+987.246563477" lastFinishedPulling="2025-09-29 11:01:03.195589467 +0000 UTC m=+1003.984731134" observedRunningTime="2025-09-29 11:01:03.836825619 +0000 UTC m=+1004.625967286" watchObservedRunningTime="2025-09-29 11:01:03.839595211 +0000 UTC m=+1004.628736878"
Sep 29 11:01:03 crc kubenswrapper[4752]: I0929 11:01:03.861574 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt" podStartSLOduration=3.121019061 podStartE2EDuration="19.861555998s" podCreationTimestamp="2025-09-29 11:00:44 +0000 UTC" firstStartedPulling="2025-09-29 11:00:46.454023543 +0000 UTC m=+987.243165210" lastFinishedPulling="2025-09-29 11:01:03.19456048 +0000 UTC m=+1003.983702147" observedRunningTime="2025-09-29 11:01:03.859066163 +0000 UTC m=+1004.648207830" watchObservedRunningTime="2025-09-29 11:01:03.861555998 +0000 UTC m=+1004.650697665"
Sep 29 11:01:04 crc kubenswrapper[4752]: I0929 11:01:04.054723 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-s6kn6"
Sep 29 11:01:04 crc kubenswrapper[4752]: I0929 11:01:04.078932 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-vdwdn"
Sep 29 11:01:04 crc kubenswrapper[4752]:
I0929 11:01:04.106391 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-fzv5x" Sep 29 11:01:04 crc kubenswrapper[4752]: I0929 11:01:04.179775 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-8ff95898-m96g4" Sep 29 11:01:04 crc kubenswrapper[4752]: I0929 11:01:04.291991 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-vxd7j" Sep 29 11:01:04 crc kubenswrapper[4752]: I0929 11:01:04.350092 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-lncbz" Sep 29 11:01:04 crc kubenswrapper[4752]: I0929 11:01:04.380402 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-gnrn4" Sep 29 11:01:04 crc kubenswrapper[4752]: I0929 11:01:04.383211 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-vwbvq" Sep 29 11:01:04 crc kubenswrapper[4752]: I0929 11:01:04.487165 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-6ffsd" Sep 29 11:01:04 crc kubenswrapper[4752]: I0929 11:01:04.491475 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-6ffsd" Sep 29 11:01:04 crc kubenswrapper[4752]: I0929 11:01:04.539994 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-qnmsk" Sep 29 11:01:04 crc kubenswrapper[4752]: I0929 11:01:04.568475 4752 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-c2twd" Sep 29 11:01:04 crc kubenswrapper[4752]: I0929 11:01:04.568849 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-5nh8j" Sep 29 11:01:04 crc kubenswrapper[4752]: I0929 11:01:04.634770 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-w5nvh" Sep 29 11:01:04 crc kubenswrapper[4752]: I0929 11:01:04.801217 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-774b97b48-zgk9d" Sep 29 11:01:04 crc kubenswrapper[4752]: I0929 11:01:04.878445 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-f66b554c6-rwzzh" Sep 29 11:01:04 crc kubenswrapper[4752]: I0929 11:01:04.899856 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-gkcw9" Sep 29 11:01:06 crc kubenswrapper[4752]: I0929 11:01:06.448932 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-474s5" Sep 29 11:01:12 crc kubenswrapper[4752]: I0929 11:01:12.882188 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-g82n6" event={"ID":"ac414d81-10d8-4ef8-aeed-3f8bf43eae1d","Type":"ContainerStarted","Data":"30eb6d9569623f99c8659db1b7e5bf269c00ea90f1f9724557e6f5106be6aa6c"} Sep 29 11:01:12 crc kubenswrapper[4752]: I0929 11:01:12.884061 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-g82n6" Sep 29 11:01:12 crc 
kubenswrapper[4752]: I0929 11:01:12.889973 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-48hc9" event={"ID":"f6d91c53-a4c9-4516-8666-75dac446a27e","Type":"ContainerStarted","Data":"551b78b9f6b3651e8e694b630ecfe60e2d8b25718591b00dcfed5498382887ba"} Sep 29 11:01:12 crc kubenswrapper[4752]: I0929 11:01:12.890262 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-48hc9" Sep 29 11:01:12 crc kubenswrapper[4752]: I0929 11:01:12.904554 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-g82n6" podStartSLOduration=3.199257704 podStartE2EDuration="28.904534878s" podCreationTimestamp="2025-09-29 11:00:44 +0000 UTC" firstStartedPulling="2025-09-29 11:00:46.490178137 +0000 UTC m=+987.279319804" lastFinishedPulling="2025-09-29 11:01:12.195455311 +0000 UTC m=+1012.984596978" observedRunningTime="2025-09-29 11:01:12.903397918 +0000 UTC m=+1013.692539585" watchObservedRunningTime="2025-09-29 11:01:12.904534878 +0000 UTC m=+1013.693676545" Sep 29 11:01:12 crc kubenswrapper[4752]: I0929 11:01:12.920848 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-48hc9" podStartSLOduration=3.226666594 podStartE2EDuration="28.9208291s" podCreationTimestamp="2025-09-29 11:00:44 +0000 UTC" firstStartedPulling="2025-09-29 11:00:46.490420634 +0000 UTC m=+987.279562291" lastFinishedPulling="2025-09-29 11:01:12.18458313 +0000 UTC m=+1012.973724797" observedRunningTime="2025-09-29 11:01:12.918967881 +0000 UTC m=+1013.708109558" watchObservedRunningTime="2025-09-29 11:01:12.9208291 +0000 UTC m=+1013.709970777" Sep 29 11:01:14 crc kubenswrapper[4752]: I0929 11:01:14.922987 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt" Sep 29 11:01:14 crc kubenswrapper[4752]: I0929 11:01:14.987908 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-tg8bp" Sep 29 11:01:24 crc kubenswrapper[4752]: I0929 11:01:24.843328 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-g82n6" Sep 29 11:01:25 crc kubenswrapper[4752]: I0929 11:01:25.412990 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-48hc9" Sep 29 11:01:29 crc kubenswrapper[4752]: I0929 11:01:29.613678 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6"] Sep 29 11:01:29 crc kubenswrapper[4752]: I0929 11:01:29.614538 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6" podUID="0f791715-ebb0-42db-b8fe-d8ab636b6bb2" containerName="operator" containerID="cri-o://11a66f3c5379b55a900c64074dc7d5dc5543cd96b9dab995ba249b4ca7985f79" gracePeriod=10 Sep 29 11:01:29 crc kubenswrapper[4752]: I0929 11:01:29.614611 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6" podUID="0f791715-ebb0-42db-b8fe-d8ab636b6bb2" containerName="kube-rbac-proxy" containerID="cri-o://391ef9bae562fcc2f50ccfa518484444a0d56e60fc13efacfdb5cef3e57933b1" gracePeriod=10 Sep 29 11:01:29 crc kubenswrapper[4752]: I0929 11:01:29.641566 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt"] Sep 29 11:01:29 crc kubenswrapper[4752]: I0929 11:01:29.641854 4752 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt" podUID="2aa7ba00-f408-407e-83cc-ce6b2c5b51fa" containerName="kube-rbac-proxy" containerID="cri-o://705735b9f9a1f68d329388ae806fd4f3f45a0de760fd3a0ae8d4129caf84011e" gracePeriod=10 Sep 29 11:01:29 crc kubenswrapper[4752]: I0929 11:01:29.641940 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt" podUID="2aa7ba00-f408-407e-83cc-ce6b2c5b51fa" containerName="manager" containerID="cri-o://264687c86f5501ed88817cc13db302ab21529cc10bcfd4654e6f99fe20b5c690" gracePeriod=10 Sep 29 11:01:30 crc kubenswrapper[4752]: I0929 11:01:30.051842 4752 generic.go:334] "Generic (PLEG): container finished" podID="0f791715-ebb0-42db-b8fe-d8ab636b6bb2" containerID="391ef9bae562fcc2f50ccfa518484444a0d56e60fc13efacfdb5cef3e57933b1" exitCode=0 Sep 29 11:01:30 crc kubenswrapper[4752]: I0929 11:01:30.052221 4752 generic.go:334] "Generic (PLEG): container finished" podID="0f791715-ebb0-42db-b8fe-d8ab636b6bb2" containerID="11a66f3c5379b55a900c64074dc7d5dc5543cd96b9dab995ba249b4ca7985f79" exitCode=0 Sep 29 11:01:30 crc kubenswrapper[4752]: I0929 11:01:30.051924 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6" event={"ID":"0f791715-ebb0-42db-b8fe-d8ab636b6bb2","Type":"ContainerDied","Data":"391ef9bae562fcc2f50ccfa518484444a0d56e60fc13efacfdb5cef3e57933b1"} Sep 29 11:01:30 crc kubenswrapper[4752]: I0929 11:01:30.052320 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6" event={"ID":"0f791715-ebb0-42db-b8fe-d8ab636b6bb2","Type":"ContainerDied","Data":"11a66f3c5379b55a900c64074dc7d5dc5543cd96b9dab995ba249b4ca7985f79"} Sep 29 11:01:30 crc kubenswrapper[4752]: I0929 11:01:30.084398 4752 
generic.go:334] "Generic (PLEG): container finished" podID="2aa7ba00-f408-407e-83cc-ce6b2c5b51fa" containerID="264687c86f5501ed88817cc13db302ab21529cc10bcfd4654e6f99fe20b5c690" exitCode=0 Sep 29 11:01:30 crc kubenswrapper[4752]: I0929 11:01:30.084438 4752 generic.go:334] "Generic (PLEG): container finished" podID="2aa7ba00-f408-407e-83cc-ce6b2c5b51fa" containerID="705735b9f9a1f68d329388ae806fd4f3f45a0de760fd3a0ae8d4129caf84011e" exitCode=0 Sep 29 11:01:30 crc kubenswrapper[4752]: I0929 11:01:30.084459 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt" event={"ID":"2aa7ba00-f408-407e-83cc-ce6b2c5b51fa","Type":"ContainerDied","Data":"264687c86f5501ed88817cc13db302ab21529cc10bcfd4654e6f99fe20b5c690"} Sep 29 11:01:30 crc kubenswrapper[4752]: I0929 11:01:30.084486 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt" event={"ID":"2aa7ba00-f408-407e-83cc-ce6b2c5b51fa","Type":"ContainerDied","Data":"705735b9f9a1f68d329388ae806fd4f3f45a0de760fd3a0ae8d4129caf84011e"} Sep 29 11:01:30 crc kubenswrapper[4752]: I0929 11:01:30.208615 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6" Sep 29 11:01:30 crc kubenswrapper[4752]: I0929 11:01:30.213530 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt" Sep 29 11:01:30 crc kubenswrapper[4752]: I0929 11:01:30.291566 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl8wr\" (UniqueName: \"kubernetes.io/projected/0f791715-ebb0-42db-b8fe-d8ab636b6bb2-kube-api-access-gl8wr\") pod \"0f791715-ebb0-42db-b8fe-d8ab636b6bb2\" (UID: \"0f791715-ebb0-42db-b8fe-d8ab636b6bb2\") " Sep 29 11:01:30 crc kubenswrapper[4752]: I0929 11:01:30.291755 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2c82\" (UniqueName: \"kubernetes.io/projected/2aa7ba00-f408-407e-83cc-ce6b2c5b51fa-kube-api-access-w2c82\") pod \"2aa7ba00-f408-407e-83cc-ce6b2c5b51fa\" (UID: \"2aa7ba00-f408-407e-83cc-ce6b2c5b51fa\") " Sep 29 11:01:30 crc kubenswrapper[4752]: I0929 11:01:30.297488 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f791715-ebb0-42db-b8fe-d8ab636b6bb2-kube-api-access-gl8wr" (OuterVolumeSpecName: "kube-api-access-gl8wr") pod "0f791715-ebb0-42db-b8fe-d8ab636b6bb2" (UID: "0f791715-ebb0-42db-b8fe-d8ab636b6bb2"). InnerVolumeSpecName "kube-api-access-gl8wr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:01:30 crc kubenswrapper[4752]: I0929 11:01:30.298728 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aa7ba00-f408-407e-83cc-ce6b2c5b51fa-kube-api-access-w2c82" (OuterVolumeSpecName: "kube-api-access-w2c82") pod "2aa7ba00-f408-407e-83cc-ce6b2c5b51fa" (UID: "2aa7ba00-f408-407e-83cc-ce6b2c5b51fa"). InnerVolumeSpecName "kube-api-access-w2c82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:01:30 crc kubenswrapper[4752]: I0929 11:01:30.392905 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl8wr\" (UniqueName: \"kubernetes.io/projected/0f791715-ebb0-42db-b8fe-d8ab636b6bb2-kube-api-access-gl8wr\") on node \"crc\" DevicePath \"\"" Sep 29 11:01:30 crc kubenswrapper[4752]: I0929 11:01:30.393166 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2c82\" (UniqueName: \"kubernetes.io/projected/2aa7ba00-f408-407e-83cc-ce6b2c5b51fa-kube-api-access-w2c82\") on node \"crc\" DevicePath \"\"" Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.096952 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt" event={"ID":"2aa7ba00-f408-407e-83cc-ce6b2c5b51fa","Type":"ContainerDied","Data":"fbc6893ad4a76341a9f2d307acba77d85fd80f866ae2be3d976482a8e905db30"} Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.097013 4752 scope.go:117] "RemoveContainer" containerID="264687c86f5501ed88817cc13db302ab21529cc10bcfd4654e6f99fe20b5c690" Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.097113 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt" Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.102846 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6" event={"ID":"0f791715-ebb0-42db-b8fe-d8ab636b6bb2","Type":"ContainerDied","Data":"07cb7bf2865db26872bcfb7f57e495159a9b768908854359059416eab3069b00"} Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.102964 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6" Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.127414 4752 scope.go:117] "RemoveContainer" containerID="705735b9f9a1f68d329388ae806fd4f3f45a0de760fd3a0ae8d4129caf84011e" Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.136179 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt"] Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.148561 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-586b879c47-crvdt"] Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.156015 4752 scope.go:117] "RemoveContainer" containerID="391ef9bae562fcc2f50ccfa518484444a0d56e60fc13efacfdb5cef3e57933b1" Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.157895 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6"] Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.164025 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d99bfc6df-bjdz6"] Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.177742 4752 scope.go:117] "RemoveContainer" containerID="11a66f3c5379b55a900c64074dc7d5dc5543cd96b9dab995ba249b4ca7985f79" Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.767507 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-index-dmdnl"] Sep 29 11:01:31 crc kubenswrapper[4752]: E0929 11:01:31.768220 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa7ba00-f408-407e-83cc-ce6b2c5b51fa" containerName="kube-rbac-proxy" Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.768246 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa7ba00-f408-407e-83cc-ce6b2c5b51fa" 
containerName="kube-rbac-proxy" Sep 29 11:01:31 crc kubenswrapper[4752]: E0929 11:01:31.768263 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f791715-ebb0-42db-b8fe-d8ab636b6bb2" containerName="kube-rbac-proxy" Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.768270 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f791715-ebb0-42db-b8fe-d8ab636b6bb2" containerName="kube-rbac-proxy" Sep 29 11:01:31 crc kubenswrapper[4752]: E0929 11:01:31.768284 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f791715-ebb0-42db-b8fe-d8ab636b6bb2" containerName="operator" Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.768291 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f791715-ebb0-42db-b8fe-d8ab636b6bb2" containerName="operator" Sep 29 11:01:31 crc kubenswrapper[4752]: E0929 11:01:31.768307 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa7ba00-f408-407e-83cc-ce6b2c5b51fa" containerName="manager" Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.768314 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa7ba00-f408-407e-83cc-ce6b2c5b51fa" containerName="manager" Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.768486 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f791715-ebb0-42db-b8fe-d8ab636b6bb2" containerName="kube-rbac-proxy" Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.768503 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f791715-ebb0-42db-b8fe-d8ab636b6bb2" containerName="operator" Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.768525 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aa7ba00-f408-407e-83cc-ce6b2c5b51fa" containerName="manager" Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.768534 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aa7ba00-f408-407e-83cc-ce6b2c5b51fa" containerName="kube-rbac-proxy" Sep 29 
11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.769063 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-index-dmdnl" Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.774155 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-index-dockercfg-wrdj4" Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.780615 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-index-dmdnl"] Sep 29 11:01:31 crc kubenswrapper[4752]: I0929 11:01:31.914063 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9hff\" (UniqueName: \"kubernetes.io/projected/39307fc3-f44d-447a-80b4-f10d10871be4-kube-api-access-c9hff\") pod \"watcher-operator-index-dmdnl\" (UID: \"39307fc3-f44d-447a-80b4-f10d10871be4\") " pod="openstack-operators/watcher-operator-index-dmdnl" Sep 29 11:01:32 crc kubenswrapper[4752]: I0929 11:01:32.015233 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9hff\" (UniqueName: \"kubernetes.io/projected/39307fc3-f44d-447a-80b4-f10d10871be4-kube-api-access-c9hff\") pod \"watcher-operator-index-dmdnl\" (UID: \"39307fc3-f44d-447a-80b4-f10d10871be4\") " pod="openstack-operators/watcher-operator-index-dmdnl" Sep 29 11:01:32 crc kubenswrapper[4752]: I0929 11:01:32.039236 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9hff\" (UniqueName: \"kubernetes.io/projected/39307fc3-f44d-447a-80b4-f10d10871be4-kube-api-access-c9hff\") pod \"watcher-operator-index-dmdnl\" (UID: \"39307fc3-f44d-447a-80b4-f10d10871be4\") " pod="openstack-operators/watcher-operator-index-dmdnl" Sep 29 11:01:32 crc kubenswrapper[4752]: I0929 11:01:32.044146 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f791715-ebb0-42db-b8fe-d8ab636b6bb2" 
path="/var/lib/kubelet/pods/0f791715-ebb0-42db-b8fe-d8ab636b6bb2/volumes" Sep 29 11:01:32 crc kubenswrapper[4752]: I0929 11:01:32.045060 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aa7ba00-f408-407e-83cc-ce6b2c5b51fa" path="/var/lib/kubelet/pods/2aa7ba00-f408-407e-83cc-ce6b2c5b51fa/volumes" Sep 29 11:01:32 crc kubenswrapper[4752]: I0929 11:01:32.092436 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-index-dmdnl" Sep 29 11:01:32 crc kubenswrapper[4752]: I0929 11:01:32.515680 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-index-dmdnl"] Sep 29 11:01:33 crc kubenswrapper[4752]: I0929 11:01:33.120276 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-index-dmdnl" event={"ID":"39307fc3-f44d-447a-80b4-f10d10871be4","Type":"ContainerStarted","Data":"df9c5cb936f71f6ad9dd2e82e9a18087d20f1b90bde85f161266010237b84145"} Sep 29 11:01:33 crc kubenswrapper[4752]: I0929 11:01:33.120738 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-index-dmdnl" event={"ID":"39307fc3-f44d-447a-80b4-f10d10871be4","Type":"ContainerStarted","Data":"c8d0d91dda7aee34c503ceed1300e0c3899764c4ef0af25ef77828469ea31456"} Sep 29 11:01:33 crc kubenswrapper[4752]: I0929 11:01:33.146646 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-index-dmdnl" podStartSLOduration=1.975725854 podStartE2EDuration="2.146630244s" podCreationTimestamp="2025-09-29 11:01:31 +0000 UTC" firstStartedPulling="2025-09-29 11:01:32.523625903 +0000 UTC m=+1033.312767570" lastFinishedPulling="2025-09-29 11:01:32.694530293 +0000 UTC m=+1033.483671960" observedRunningTime="2025-09-29 11:01:33.144062348 +0000 UTC m=+1033.933204015" watchObservedRunningTime="2025-09-29 11:01:33.146630244 +0000 UTC m=+1033.935771911" Sep 29 11:01:42 crc 
kubenswrapper[4752]: I0929 11:01:42.093550 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-index-dmdnl" Sep 29 11:01:42 crc kubenswrapper[4752]: I0929 11:01:42.093967 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/watcher-operator-index-dmdnl" Sep 29 11:01:42 crc kubenswrapper[4752]: I0929 11:01:42.145279 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/watcher-operator-index-dmdnl" Sep 29 11:01:42 crc kubenswrapper[4752]: I0929 11:01:42.232468 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-index-dmdnl" Sep 29 11:01:45 crc kubenswrapper[4752]: I0929 11:01:45.614751 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm"] Sep 29 11:01:45 crc kubenswrapper[4752]: I0929 11:01:45.617349 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm" Sep 29 11:01:45 crc kubenswrapper[4752]: I0929 11:01:45.622320 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-x25gs" Sep 29 11:01:45 crc kubenswrapper[4752]: I0929 11:01:45.625463 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm"] Sep 29 11:01:45 crc kubenswrapper[4752]: I0929 11:01:45.721842 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f1ec6fb4-143b-4af3-ac83-9a4e73a138eb-bundle\") pod \"ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm\" (UID: \"f1ec6fb4-143b-4af3-ac83-9a4e73a138eb\") " pod="openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm" Sep 29 11:01:45 crc kubenswrapper[4752]: I0929 11:01:45.722030 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f1ec6fb4-143b-4af3-ac83-9a4e73a138eb-util\") pod \"ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm\" (UID: \"f1ec6fb4-143b-4af3-ac83-9a4e73a138eb\") " pod="openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm" Sep 29 11:01:45 crc kubenswrapper[4752]: I0929 11:01:45.722068 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5ch6\" (UniqueName: \"kubernetes.io/projected/f1ec6fb4-143b-4af3-ac83-9a4e73a138eb-kube-api-access-w5ch6\") pod \"ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm\" (UID: \"f1ec6fb4-143b-4af3-ac83-9a4e73a138eb\") " pod="openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm" Sep 29 11:01:45 crc kubenswrapper[4752]: I0929 
11:01:45.824233 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f1ec6fb4-143b-4af3-ac83-9a4e73a138eb-bundle\") pod \"ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm\" (UID: \"f1ec6fb4-143b-4af3-ac83-9a4e73a138eb\") " pod="openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm" Sep 29 11:01:45 crc kubenswrapper[4752]: I0929 11:01:45.824462 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f1ec6fb4-143b-4af3-ac83-9a4e73a138eb-util\") pod \"ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm\" (UID: \"f1ec6fb4-143b-4af3-ac83-9a4e73a138eb\") " pod="openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm" Sep 29 11:01:45 crc kubenswrapper[4752]: I0929 11:01:45.824540 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5ch6\" (UniqueName: \"kubernetes.io/projected/f1ec6fb4-143b-4af3-ac83-9a4e73a138eb-kube-api-access-w5ch6\") pod \"ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm\" (UID: \"f1ec6fb4-143b-4af3-ac83-9a4e73a138eb\") " pod="openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm" Sep 29 11:01:45 crc kubenswrapper[4752]: I0929 11:01:45.824773 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f1ec6fb4-143b-4af3-ac83-9a4e73a138eb-bundle\") pod \"ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm\" (UID: \"f1ec6fb4-143b-4af3-ac83-9a4e73a138eb\") " pod="openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm" Sep 29 11:01:45 crc kubenswrapper[4752]: I0929 11:01:45.825863 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f1ec6fb4-143b-4af3-ac83-9a4e73a138eb-util\") pod \"ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm\" (UID: \"f1ec6fb4-143b-4af3-ac83-9a4e73a138eb\") " pod="openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm" Sep 29 11:01:45 crc kubenswrapper[4752]: I0929 11:01:45.849827 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5ch6\" (UniqueName: \"kubernetes.io/projected/f1ec6fb4-143b-4af3-ac83-9a4e73a138eb-kube-api-access-w5ch6\") pod \"ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm\" (UID: \"f1ec6fb4-143b-4af3-ac83-9a4e73a138eb\") " pod="openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm" Sep 29 11:01:45 crc kubenswrapper[4752]: I0929 11:01:45.954684 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm" Sep 29 11:01:46 crc kubenswrapper[4752]: I0929 11:01:46.384489 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm"] Sep 29 11:01:47 crc kubenswrapper[4752]: I0929 11:01:47.231884 4752 generic.go:334] "Generic (PLEG): container finished" podID="f1ec6fb4-143b-4af3-ac83-9a4e73a138eb" containerID="0a65037883db62dcd63f721c43d750e4c8704343b31d1092b898a7b1fa75623a" exitCode=0 Sep 29 11:01:47 crc kubenswrapper[4752]: I0929 11:01:47.231936 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm" event={"ID":"f1ec6fb4-143b-4af3-ac83-9a4e73a138eb","Type":"ContainerDied","Data":"0a65037883db62dcd63f721c43d750e4c8704343b31d1092b898a7b1fa75623a"} Sep 29 11:01:47 crc kubenswrapper[4752]: I0929 11:01:47.232320 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm" event={"ID":"f1ec6fb4-143b-4af3-ac83-9a4e73a138eb","Type":"ContainerStarted","Data":"805a7760f0c1fc010a3305d6b31ac36f84a72d2da4030034a71816fceab036de"} Sep 29 11:01:48 crc kubenswrapper[4752]: I0929 11:01:48.243649 4752 generic.go:334] "Generic (PLEG): container finished" podID="f1ec6fb4-143b-4af3-ac83-9a4e73a138eb" containerID="7439931729d1aee5bd5279e6bccc3e093a14c854c6294790e3a6c2794bc63159" exitCode=0 Sep 29 11:01:48 crc kubenswrapper[4752]: I0929 11:01:48.243711 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm" event={"ID":"f1ec6fb4-143b-4af3-ac83-9a4e73a138eb","Type":"ContainerDied","Data":"7439931729d1aee5bd5279e6bccc3e093a14c854c6294790e3a6c2794bc63159"} Sep 29 11:01:49 crc kubenswrapper[4752]: I0929 11:01:49.257125 4752 generic.go:334] "Generic (PLEG): container finished" podID="f1ec6fb4-143b-4af3-ac83-9a4e73a138eb" containerID="f13a813482c4f6aef3f7ef13ab938d5b773f418cfdf0bcae1c6abcc47f3e83e9" exitCode=0 Sep 29 11:01:49 crc kubenswrapper[4752]: I0929 11:01:49.257792 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm" event={"ID":"f1ec6fb4-143b-4af3-ac83-9a4e73a138eb","Type":"ContainerDied","Data":"f13a813482c4f6aef3f7ef13ab938d5b773f418cfdf0bcae1c6abcc47f3e83e9"} Sep 29 11:01:50 crc kubenswrapper[4752]: I0929 11:01:50.584836 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm" Sep 29 11:01:50 crc kubenswrapper[4752]: I0929 11:01:50.696392 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f1ec6fb4-143b-4af3-ac83-9a4e73a138eb-bundle\") pod \"f1ec6fb4-143b-4af3-ac83-9a4e73a138eb\" (UID: \"f1ec6fb4-143b-4af3-ac83-9a4e73a138eb\") " Sep 29 11:01:50 crc kubenswrapper[4752]: I0929 11:01:50.696441 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5ch6\" (UniqueName: \"kubernetes.io/projected/f1ec6fb4-143b-4af3-ac83-9a4e73a138eb-kube-api-access-w5ch6\") pod \"f1ec6fb4-143b-4af3-ac83-9a4e73a138eb\" (UID: \"f1ec6fb4-143b-4af3-ac83-9a4e73a138eb\") " Sep 29 11:01:50 crc kubenswrapper[4752]: I0929 11:01:50.696598 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f1ec6fb4-143b-4af3-ac83-9a4e73a138eb-util\") pod \"f1ec6fb4-143b-4af3-ac83-9a4e73a138eb\" (UID: \"f1ec6fb4-143b-4af3-ac83-9a4e73a138eb\") " Sep 29 11:01:50 crc kubenswrapper[4752]: I0929 11:01:50.697416 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1ec6fb4-143b-4af3-ac83-9a4e73a138eb-bundle" (OuterVolumeSpecName: "bundle") pod "f1ec6fb4-143b-4af3-ac83-9a4e73a138eb" (UID: "f1ec6fb4-143b-4af3-ac83-9a4e73a138eb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:01:50 crc kubenswrapper[4752]: I0929 11:01:50.703220 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1ec6fb4-143b-4af3-ac83-9a4e73a138eb-kube-api-access-w5ch6" (OuterVolumeSpecName: "kube-api-access-w5ch6") pod "f1ec6fb4-143b-4af3-ac83-9a4e73a138eb" (UID: "f1ec6fb4-143b-4af3-ac83-9a4e73a138eb"). InnerVolumeSpecName "kube-api-access-w5ch6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:01:50 crc kubenswrapper[4752]: I0929 11:01:50.716024 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1ec6fb4-143b-4af3-ac83-9a4e73a138eb-util" (OuterVolumeSpecName: "util") pod "f1ec6fb4-143b-4af3-ac83-9a4e73a138eb" (UID: "f1ec6fb4-143b-4af3-ac83-9a4e73a138eb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:01:50 crc kubenswrapper[4752]: I0929 11:01:50.798346 4752 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f1ec6fb4-143b-4af3-ac83-9a4e73a138eb-util\") on node \"crc\" DevicePath \"\"" Sep 29 11:01:50 crc kubenswrapper[4752]: I0929 11:01:50.798414 4752 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f1ec6fb4-143b-4af3-ac83-9a4e73a138eb-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:01:50 crc kubenswrapper[4752]: I0929 11:01:50.798433 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5ch6\" (UniqueName: \"kubernetes.io/projected/f1ec6fb4-143b-4af3-ac83-9a4e73a138eb-kube-api-access-w5ch6\") on node \"crc\" DevicePath \"\"" Sep 29 11:01:51 crc kubenswrapper[4752]: I0929 11:01:51.282080 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm" event={"ID":"f1ec6fb4-143b-4af3-ac83-9a4e73a138eb","Type":"ContainerDied","Data":"805a7760f0c1fc010a3305d6b31ac36f84a72d2da4030034a71816fceab036de"} Sep 29 11:01:51 crc kubenswrapper[4752]: I0929 11:01:51.282459 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="805a7760f0c1fc010a3305d6b31ac36f84a72d2da4030034a71816fceab036de" Sep 29 11:01:51 crc kubenswrapper[4752]: I0929 11:01:51.282175 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm" Sep 29 11:01:54 crc kubenswrapper[4752]: I0929 11:01:54.264679 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6"] Sep 29 11:01:54 crc kubenswrapper[4752]: E0929 11:01:54.265520 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ec6fb4-143b-4af3-ac83-9a4e73a138eb" containerName="extract" Sep 29 11:01:54 crc kubenswrapper[4752]: I0929 11:01:54.265536 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ec6fb4-143b-4af3-ac83-9a4e73a138eb" containerName="extract" Sep 29 11:01:54 crc kubenswrapper[4752]: E0929 11:01:54.265552 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ec6fb4-143b-4af3-ac83-9a4e73a138eb" containerName="pull" Sep 29 11:01:54 crc kubenswrapper[4752]: I0929 11:01:54.265560 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ec6fb4-143b-4af3-ac83-9a4e73a138eb" containerName="pull" Sep 29 11:01:54 crc kubenswrapper[4752]: E0929 11:01:54.265588 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ec6fb4-143b-4af3-ac83-9a4e73a138eb" containerName="util" Sep 29 11:01:54 crc kubenswrapper[4752]: I0929 11:01:54.265614 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ec6fb4-143b-4af3-ac83-9a4e73a138eb" containerName="util" Sep 29 11:01:54 crc kubenswrapper[4752]: I0929 11:01:54.265894 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ec6fb4-143b-4af3-ac83-9a4e73a138eb" containerName="extract" Sep 29 11:01:54 crc kubenswrapper[4752]: I0929 11:01:54.266888 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" Sep 29 11:01:54 crc kubenswrapper[4752]: I0929 11:01:54.270498 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-service-cert" Sep 29 11:01:54 crc kubenswrapper[4752]: I0929 11:01:54.271536 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-wg26p" Sep 29 11:01:54 crc kubenswrapper[4752]: I0929 11:01:54.287408 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6"] Sep 29 11:01:54 crc kubenswrapper[4752]: I0929 11:01:54.450967 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d3815df6-d40a-4370-8c81-ec44ee45503b-apiservice-cert\") pod \"watcher-operator-controller-manager-599b7c5d86-rsmf6\" (UID: \"d3815df6-d40a-4370-8c81-ec44ee45503b\") " pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" Sep 29 11:01:54 crc kubenswrapper[4752]: I0929 11:01:54.451131 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hwbp\" (UniqueName: \"kubernetes.io/projected/d3815df6-d40a-4370-8c81-ec44ee45503b-kube-api-access-9hwbp\") pod \"watcher-operator-controller-manager-599b7c5d86-rsmf6\" (UID: \"d3815df6-d40a-4370-8c81-ec44ee45503b\") " pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" Sep 29 11:01:54 crc kubenswrapper[4752]: I0929 11:01:54.451250 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d3815df6-d40a-4370-8c81-ec44ee45503b-webhook-cert\") pod \"watcher-operator-controller-manager-599b7c5d86-rsmf6\" (UID: 
\"d3815df6-d40a-4370-8c81-ec44ee45503b\") " pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" Sep 29 11:01:54 crc kubenswrapper[4752]: I0929 11:01:54.553128 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d3815df6-d40a-4370-8c81-ec44ee45503b-apiservice-cert\") pod \"watcher-operator-controller-manager-599b7c5d86-rsmf6\" (UID: \"d3815df6-d40a-4370-8c81-ec44ee45503b\") " pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" Sep 29 11:01:54 crc kubenswrapper[4752]: I0929 11:01:54.553464 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hwbp\" (UniqueName: \"kubernetes.io/projected/d3815df6-d40a-4370-8c81-ec44ee45503b-kube-api-access-9hwbp\") pod \"watcher-operator-controller-manager-599b7c5d86-rsmf6\" (UID: \"d3815df6-d40a-4370-8c81-ec44ee45503b\") " pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" Sep 29 11:01:54 crc kubenswrapper[4752]: I0929 11:01:54.553613 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d3815df6-d40a-4370-8c81-ec44ee45503b-webhook-cert\") pod \"watcher-operator-controller-manager-599b7c5d86-rsmf6\" (UID: \"d3815df6-d40a-4370-8c81-ec44ee45503b\") " pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" Sep 29 11:01:54 crc kubenswrapper[4752]: I0929 11:01:54.561129 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d3815df6-d40a-4370-8c81-ec44ee45503b-webhook-cert\") pod \"watcher-operator-controller-manager-599b7c5d86-rsmf6\" (UID: \"d3815df6-d40a-4370-8c81-ec44ee45503b\") " pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" Sep 29 11:01:54 crc kubenswrapper[4752]: I0929 11:01:54.561193 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d3815df6-d40a-4370-8c81-ec44ee45503b-apiservice-cert\") pod \"watcher-operator-controller-manager-599b7c5d86-rsmf6\" (UID: \"d3815df6-d40a-4370-8c81-ec44ee45503b\") " pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" Sep 29 11:01:54 crc kubenswrapper[4752]: I0929 11:01:54.573077 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hwbp\" (UniqueName: \"kubernetes.io/projected/d3815df6-d40a-4370-8c81-ec44ee45503b-kube-api-access-9hwbp\") pod \"watcher-operator-controller-manager-599b7c5d86-rsmf6\" (UID: \"d3815df6-d40a-4370-8c81-ec44ee45503b\") " pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" Sep 29 11:01:54 crc kubenswrapper[4752]: I0929 11:01:54.593190 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" Sep 29 11:01:55 crc kubenswrapper[4752]: I0929 11:01:55.077298 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6"] Sep 29 11:01:55 crc kubenswrapper[4752]: W0929 11:01:55.077970 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3815df6_d40a_4370_8c81_ec44ee45503b.slice/crio-2bea76149348ec0ff9b3fa4e82a0f8e7d96290e949dad564c0bd364f6d75522d WatchSource:0}: Error finding container 2bea76149348ec0ff9b3fa4e82a0f8e7d96290e949dad564c0bd364f6d75522d: Status 404 returned error can't find the container with id 2bea76149348ec0ff9b3fa4e82a0f8e7d96290e949dad564c0bd364f6d75522d Sep 29 11:01:55 crc kubenswrapper[4752]: I0929 11:01:55.322836 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" 
event={"ID":"d3815df6-d40a-4370-8c81-ec44ee45503b","Type":"ContainerStarted","Data":"0c10dfe5194d227008198977e32ee296365f71ba61104fc728b2434dec7ca45c"} Sep 29 11:01:55 crc kubenswrapper[4752]: I0929 11:01:55.322900 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" event={"ID":"d3815df6-d40a-4370-8c81-ec44ee45503b","Type":"ContainerStarted","Data":"2bea76149348ec0ff9b3fa4e82a0f8e7d96290e949dad564c0bd364f6d75522d"} Sep 29 11:01:56 crc kubenswrapper[4752]: I0929 11:01:56.175046 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 11:01:56 crc kubenswrapper[4752]: I0929 11:01:56.175470 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:01:56 crc kubenswrapper[4752]: I0929 11:01:56.336848 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" event={"ID":"d3815df6-d40a-4370-8c81-ec44ee45503b","Type":"ContainerStarted","Data":"0a015d772498660425528c3c8101e402acfdc1c6eac19650c38d8f7444f3df5c"} Sep 29 11:01:56 crc kubenswrapper[4752]: I0929 11:01:56.337225 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" Sep 29 11:01:56 crc kubenswrapper[4752]: I0929 11:01:56.368417 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" podStartSLOduration=2.368384004 podStartE2EDuration="2.368384004s" podCreationTimestamp="2025-09-29 11:01:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:01:56.360639034 +0000 UTC m=+1057.149780711" watchObservedRunningTime="2025-09-29 11:01:56.368384004 +0000 UTC m=+1057.157525671" Sep 29 11:02:04 crc kubenswrapper[4752]: I0929 11:02:04.600186 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" Sep 29 11:02:08 crc kubenswrapper[4752]: I0929 11:02:08.244871 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-565b894b7f-l7zgn"] Sep 29 11:02:08 crc kubenswrapper[4752]: I0929 11:02:08.246788 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-565b894b7f-l7zgn" Sep 29 11:02:08 crc kubenswrapper[4752]: I0929 11:02:08.269565 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-565b894b7f-l7zgn"] Sep 29 11:02:08 crc kubenswrapper[4752]: I0929 11:02:08.368042 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2lk6\" (UniqueName: \"kubernetes.io/projected/2dd90864-f3b9-4972-933f-d6c7a146f8c0-kube-api-access-n2lk6\") pod \"watcher-operator-controller-manager-565b894b7f-l7zgn\" (UID: \"2dd90864-f3b9-4972-933f-d6c7a146f8c0\") " pod="openstack-operators/watcher-operator-controller-manager-565b894b7f-l7zgn" Sep 29 11:02:08 crc kubenswrapper[4752]: I0929 11:02:08.368192 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/2dd90864-f3b9-4972-933f-d6c7a146f8c0-apiservice-cert\") pod \"watcher-operator-controller-manager-565b894b7f-l7zgn\" (UID: \"2dd90864-f3b9-4972-933f-d6c7a146f8c0\") " pod="openstack-operators/watcher-operator-controller-manager-565b894b7f-l7zgn" Sep 29 11:02:08 crc kubenswrapper[4752]: I0929 11:02:08.368215 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2dd90864-f3b9-4972-933f-d6c7a146f8c0-webhook-cert\") pod \"watcher-operator-controller-manager-565b894b7f-l7zgn\" (UID: \"2dd90864-f3b9-4972-933f-d6c7a146f8c0\") " pod="openstack-operators/watcher-operator-controller-manager-565b894b7f-l7zgn" Sep 29 11:02:08 crc kubenswrapper[4752]: I0929 11:02:08.469577 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2lk6\" (UniqueName: \"kubernetes.io/projected/2dd90864-f3b9-4972-933f-d6c7a146f8c0-kube-api-access-n2lk6\") pod \"watcher-operator-controller-manager-565b894b7f-l7zgn\" (UID: \"2dd90864-f3b9-4972-933f-d6c7a146f8c0\") " pod="openstack-operators/watcher-operator-controller-manager-565b894b7f-l7zgn" Sep 29 11:02:08 crc kubenswrapper[4752]: I0929 11:02:08.469660 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2dd90864-f3b9-4972-933f-d6c7a146f8c0-webhook-cert\") pod \"watcher-operator-controller-manager-565b894b7f-l7zgn\" (UID: \"2dd90864-f3b9-4972-933f-d6c7a146f8c0\") " pod="openstack-operators/watcher-operator-controller-manager-565b894b7f-l7zgn" Sep 29 11:02:08 crc kubenswrapper[4752]: I0929 11:02:08.469680 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2dd90864-f3b9-4972-933f-d6c7a146f8c0-apiservice-cert\") pod \"watcher-operator-controller-manager-565b894b7f-l7zgn\" (UID: \"2dd90864-f3b9-4972-933f-d6c7a146f8c0\") " 
pod="openstack-operators/watcher-operator-controller-manager-565b894b7f-l7zgn" Sep 29 11:02:08 crc kubenswrapper[4752]: I0929 11:02:08.476064 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2dd90864-f3b9-4972-933f-d6c7a146f8c0-webhook-cert\") pod \"watcher-operator-controller-manager-565b894b7f-l7zgn\" (UID: \"2dd90864-f3b9-4972-933f-d6c7a146f8c0\") " pod="openstack-operators/watcher-operator-controller-manager-565b894b7f-l7zgn" Sep 29 11:02:08 crc kubenswrapper[4752]: I0929 11:02:08.476127 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2dd90864-f3b9-4972-933f-d6c7a146f8c0-apiservice-cert\") pod \"watcher-operator-controller-manager-565b894b7f-l7zgn\" (UID: \"2dd90864-f3b9-4972-933f-d6c7a146f8c0\") " pod="openstack-operators/watcher-operator-controller-manager-565b894b7f-l7zgn" Sep 29 11:02:08 crc kubenswrapper[4752]: I0929 11:02:08.495157 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2lk6\" (UniqueName: \"kubernetes.io/projected/2dd90864-f3b9-4972-933f-d6c7a146f8c0-kube-api-access-n2lk6\") pod \"watcher-operator-controller-manager-565b894b7f-l7zgn\" (UID: \"2dd90864-f3b9-4972-933f-d6c7a146f8c0\") " pod="openstack-operators/watcher-operator-controller-manager-565b894b7f-l7zgn" Sep 29 11:02:08 crc kubenswrapper[4752]: I0929 11:02:08.577449 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-565b894b7f-l7zgn" Sep 29 11:02:09 crc kubenswrapper[4752]: I0929 11:02:09.018956 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-565b894b7f-l7zgn"] Sep 29 11:02:09 crc kubenswrapper[4752]: I0929 11:02:09.438592 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-565b894b7f-l7zgn" event={"ID":"2dd90864-f3b9-4972-933f-d6c7a146f8c0","Type":"ContainerStarted","Data":"8ba759b5d54ce6fa673156f913277a6c67d4dfa73d1f1deac35e8e93fe2a286b"} Sep 29 11:02:09 crc kubenswrapper[4752]: I0929 11:02:09.439041 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-565b894b7f-l7zgn" event={"ID":"2dd90864-f3b9-4972-933f-d6c7a146f8c0","Type":"ContainerStarted","Data":"d4584e542e053ce3dba1ef22486518051e22522049891f9c2ab0b117dd1dc58b"} Sep 29 11:02:09 crc kubenswrapper[4752]: I0929 11:02:09.439065 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-565b894b7f-l7zgn" Sep 29 11:02:09 crc kubenswrapper[4752]: I0929 11:02:09.439082 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-565b894b7f-l7zgn" event={"ID":"2dd90864-f3b9-4972-933f-d6c7a146f8c0","Type":"ContainerStarted","Data":"8160eeac70490100a61409b38758578dd8553e434e0f367ef83122d931c6df58"} Sep 29 11:02:09 crc kubenswrapper[4752]: I0929 11:02:09.459547 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-565b894b7f-l7zgn" podStartSLOduration=1.459526768 podStartE2EDuration="1.459526768s" podCreationTimestamp="2025-09-29 11:02:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-29 11:02:09.455714079 +0000 UTC m=+1070.244855766" watchObservedRunningTime="2025-09-29 11:02:09.459526768 +0000 UTC m=+1070.248668435" Sep 29 11:02:18 crc kubenswrapper[4752]: I0929 11:02:18.583181 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-565b894b7f-l7zgn" Sep 29 11:02:18 crc kubenswrapper[4752]: I0929 11:02:18.654156 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6"] Sep 29 11:02:18 crc kubenswrapper[4752]: I0929 11:02:18.654516 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" podUID="d3815df6-d40a-4370-8c81-ec44ee45503b" containerName="manager" containerID="cri-o://0c10dfe5194d227008198977e32ee296365f71ba61104fc728b2434dec7ca45c" gracePeriod=10 Sep 29 11:02:18 crc kubenswrapper[4752]: I0929 11:02:18.654893 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" podUID="d3815df6-d40a-4370-8c81-ec44ee45503b" containerName="kube-rbac-proxy" containerID="cri-o://0a015d772498660425528c3c8101e402acfdc1c6eac19650c38d8f7444f3df5c" gracePeriod=10 Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.099896 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.246112 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hwbp\" (UniqueName: \"kubernetes.io/projected/d3815df6-d40a-4370-8c81-ec44ee45503b-kube-api-access-9hwbp\") pod \"d3815df6-d40a-4370-8c81-ec44ee45503b\" (UID: \"d3815df6-d40a-4370-8c81-ec44ee45503b\") " Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.246220 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d3815df6-d40a-4370-8c81-ec44ee45503b-apiservice-cert\") pod \"d3815df6-d40a-4370-8c81-ec44ee45503b\" (UID: \"d3815df6-d40a-4370-8c81-ec44ee45503b\") " Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.246376 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d3815df6-d40a-4370-8c81-ec44ee45503b-webhook-cert\") pod \"d3815df6-d40a-4370-8c81-ec44ee45503b\" (UID: \"d3815df6-d40a-4370-8c81-ec44ee45503b\") " Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.251739 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3815df6-d40a-4370-8c81-ec44ee45503b-kube-api-access-9hwbp" (OuterVolumeSpecName: "kube-api-access-9hwbp") pod "d3815df6-d40a-4370-8c81-ec44ee45503b" (UID: "d3815df6-d40a-4370-8c81-ec44ee45503b"). InnerVolumeSpecName "kube-api-access-9hwbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.251942 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3815df6-d40a-4370-8c81-ec44ee45503b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "d3815df6-d40a-4370-8c81-ec44ee45503b" (UID: "d3815df6-d40a-4370-8c81-ec44ee45503b"). 
InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.252202 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3815df6-d40a-4370-8c81-ec44ee45503b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "d3815df6-d40a-4370-8c81-ec44ee45503b" (UID: "d3815df6-d40a-4370-8c81-ec44ee45503b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.349456 4752 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d3815df6-d40a-4370-8c81-ec44ee45503b-webhook-cert\") on node \"crc\" DevicePath \"\"" Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.349491 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hwbp\" (UniqueName: \"kubernetes.io/projected/d3815df6-d40a-4370-8c81-ec44ee45503b-kube-api-access-9hwbp\") on node \"crc\" DevicePath \"\"" Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.349506 4752 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d3815df6-d40a-4370-8c81-ec44ee45503b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.527991 4752 generic.go:334] "Generic (PLEG): container finished" podID="d3815df6-d40a-4370-8c81-ec44ee45503b" containerID="0a015d772498660425528c3c8101e402acfdc1c6eac19650c38d8f7444f3df5c" exitCode=0 Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.528022 4752 generic.go:334] "Generic (PLEG): container finished" podID="d3815df6-d40a-4370-8c81-ec44ee45503b" containerID="0c10dfe5194d227008198977e32ee296365f71ba61104fc728b2434dec7ca45c" exitCode=0 Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.528038 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" event={"ID":"d3815df6-d40a-4370-8c81-ec44ee45503b","Type":"ContainerDied","Data":"0a015d772498660425528c3c8101e402acfdc1c6eac19650c38d8f7444f3df5c"} Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.528098 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" event={"ID":"d3815df6-d40a-4370-8c81-ec44ee45503b","Type":"ContainerDied","Data":"0c10dfe5194d227008198977e32ee296365f71ba61104fc728b2434dec7ca45c"} Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.528062 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.528113 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6" event={"ID":"d3815df6-d40a-4370-8c81-ec44ee45503b","Type":"ContainerDied","Data":"2bea76149348ec0ff9b3fa4e82a0f8e7d96290e949dad564c0bd364f6d75522d"} Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.528124 4752 scope.go:117] "RemoveContainer" containerID="0a015d772498660425528c3c8101e402acfdc1c6eac19650c38d8f7444f3df5c" Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.555251 4752 scope.go:117] "RemoveContainer" containerID="0c10dfe5194d227008198977e32ee296365f71ba61104fc728b2434dec7ca45c" Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.567317 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6"] Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.575952 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-599b7c5d86-rsmf6"] Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.576861 4752 scope.go:117] "RemoveContainer" 
containerID="0a015d772498660425528c3c8101e402acfdc1c6eac19650c38d8f7444f3df5c" Sep 29 11:02:19 crc kubenswrapper[4752]: E0929 11:02:19.577362 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a015d772498660425528c3c8101e402acfdc1c6eac19650c38d8f7444f3df5c\": container with ID starting with 0a015d772498660425528c3c8101e402acfdc1c6eac19650c38d8f7444f3df5c not found: ID does not exist" containerID="0a015d772498660425528c3c8101e402acfdc1c6eac19650c38d8f7444f3df5c" Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.577404 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a015d772498660425528c3c8101e402acfdc1c6eac19650c38d8f7444f3df5c"} err="failed to get container status \"0a015d772498660425528c3c8101e402acfdc1c6eac19650c38d8f7444f3df5c\": rpc error: code = NotFound desc = could not find container \"0a015d772498660425528c3c8101e402acfdc1c6eac19650c38d8f7444f3df5c\": container with ID starting with 0a015d772498660425528c3c8101e402acfdc1c6eac19650c38d8f7444f3df5c not found: ID does not exist" Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.577435 4752 scope.go:117] "RemoveContainer" containerID="0c10dfe5194d227008198977e32ee296365f71ba61104fc728b2434dec7ca45c" Sep 29 11:02:19 crc kubenswrapper[4752]: E0929 11:02:19.577946 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c10dfe5194d227008198977e32ee296365f71ba61104fc728b2434dec7ca45c\": container with ID starting with 0c10dfe5194d227008198977e32ee296365f71ba61104fc728b2434dec7ca45c not found: ID does not exist" containerID="0c10dfe5194d227008198977e32ee296365f71ba61104fc728b2434dec7ca45c" Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.577990 4752 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0c10dfe5194d227008198977e32ee296365f71ba61104fc728b2434dec7ca45c"} err="failed to get container status \"0c10dfe5194d227008198977e32ee296365f71ba61104fc728b2434dec7ca45c\": rpc error: code = NotFound desc = could not find container \"0c10dfe5194d227008198977e32ee296365f71ba61104fc728b2434dec7ca45c\": container with ID starting with 0c10dfe5194d227008198977e32ee296365f71ba61104fc728b2434dec7ca45c not found: ID does not exist" Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.578044 4752 scope.go:117] "RemoveContainer" containerID="0a015d772498660425528c3c8101e402acfdc1c6eac19650c38d8f7444f3df5c" Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.578524 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a015d772498660425528c3c8101e402acfdc1c6eac19650c38d8f7444f3df5c"} err="failed to get container status \"0a015d772498660425528c3c8101e402acfdc1c6eac19650c38d8f7444f3df5c\": rpc error: code = NotFound desc = could not find container \"0a015d772498660425528c3c8101e402acfdc1c6eac19650c38d8f7444f3df5c\": container with ID starting with 0a015d772498660425528c3c8101e402acfdc1c6eac19650c38d8f7444f3df5c not found: ID does not exist" Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.578695 4752 scope.go:117] "RemoveContainer" containerID="0c10dfe5194d227008198977e32ee296365f71ba61104fc728b2434dec7ca45c" Sep 29 11:02:19 crc kubenswrapper[4752]: I0929 11:02:19.579579 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c10dfe5194d227008198977e32ee296365f71ba61104fc728b2434dec7ca45c"} err="failed to get container status \"0c10dfe5194d227008198977e32ee296365f71ba61104fc728b2434dec7ca45c\": rpc error: code = NotFound desc = could not find container \"0c10dfe5194d227008198977e32ee296365f71ba61104fc728b2434dec7ca45c\": container with ID starting with 0c10dfe5194d227008198977e32ee296365f71ba61104fc728b2434dec7ca45c not found: ID does not 
exist" Sep 29 11:02:20 crc kubenswrapper[4752]: I0929 11:02:20.040961 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3815df6-d40a-4370-8c81-ec44ee45503b" path="/var/lib/kubelet/pods/d3815df6-d40a-4370-8c81-ec44ee45503b/volumes" Sep 29 11:02:26 crc kubenswrapper[4752]: I0929 11:02:26.175771 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 11:02:26 crc kubenswrapper[4752]: I0929 11:02:26.176287 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.234753 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/rabbitmq-server-0"] Sep 29 11:02:31 crc kubenswrapper[4752]: E0929 11:02:31.237270 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3815df6-d40a-4370-8c81-ec44ee45503b" containerName="manager" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.237316 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3815df6-d40a-4370-8c81-ec44ee45503b" containerName="manager" Sep 29 11:02:31 crc kubenswrapper[4752]: E0929 11:02:31.237351 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3815df6-d40a-4370-8c81-ec44ee45503b" containerName="kube-rbac-proxy" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.237364 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3815df6-d40a-4370-8c81-ec44ee45503b" containerName="kube-rbac-proxy" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 
11:02:31.237560 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3815df6-d40a-4370-8c81-ec44ee45503b" containerName="manager" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.237582 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3815df6-d40a-4370-8c81-ec44ee45503b" containerName="kube-rbac-proxy" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.239054 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.242487 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openshift-service-ca.crt" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.242568 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-rabbitmq-svc" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.242995 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-config-data" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.243053 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-plugins-conf" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.243172 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-server-conf" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.244628 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-server-dockercfg-ct8vp" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.246161 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-default-user" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.246649 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-erlang-cookie" Sep 29 11:02:31 crc 
kubenswrapper[4752]: I0929 11:02:31.246865 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"kube-root-ca.crt" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.264097 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-server-0"] Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.330107 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0b2b1ef4-961c-4803-856d-4d6deb42cc10-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.330175 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a1330311-a981-4a87-bb2b-f948999d3ebc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1330311-a981-4a87-bb2b-f948999d3ebc\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.330217 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b2b1ef4-961c-4803-856d-4d6deb42cc10-config-data\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.330269 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0b2b1ef4-961c-4803-856d-4d6deb42cc10-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc 
kubenswrapper[4752]: I0929 11:02:31.330309 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0b2b1ef4-961c-4803-856d-4d6deb42cc10-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.330363 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0b2b1ef4-961c-4803-856d-4d6deb42cc10-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.330427 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prjc8\" (UniqueName: \"kubernetes.io/projected/0b2b1ef4-961c-4803-856d-4d6deb42cc10-kube-api-access-prjc8\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.330470 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0b2b1ef4-961c-4803-856d-4d6deb42cc10-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.330499 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0b2b1ef4-961c-4803-856d-4d6deb42cc10-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 
11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.330533 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0b2b1ef4-961c-4803-856d-4d6deb42cc10-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.330555 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0b2b1ef4-961c-4803-856d-4d6deb42cc10-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.432312 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0b2b1ef4-961c-4803-856d-4d6deb42cc10-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.432785 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0b2b1ef4-961c-4803-856d-4d6deb42cc10-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.432854 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0b2b1ef4-961c-4803-856d-4d6deb42cc10-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.432872 
4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0b2b1ef4-961c-4803-856d-4d6deb42cc10-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.432913 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0b2b1ef4-961c-4803-856d-4d6deb42cc10-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.432950 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0b2b1ef4-961c-4803-856d-4d6deb42cc10-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.432995 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a1330311-a981-4a87-bb2b-f948999d3ebc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1330311-a981-4a87-bb2b-f948999d3ebc\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.433030 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b2b1ef4-961c-4803-856d-4d6deb42cc10-config-data\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.434106 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0b2b1ef4-961c-4803-856d-4d6deb42cc10-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.434170 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b2b1ef4-961c-4803-856d-4d6deb42cc10-config-data\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.434258 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0b2b1ef4-961c-4803-856d-4d6deb42cc10-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.434673 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0b2b1ef4-961c-4803-856d-4d6deb42cc10-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.434765 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0b2b1ef4-961c-4803-856d-4d6deb42cc10-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.434872 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0b2b1ef4-961c-4803-856d-4d6deb42cc10-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.434977 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prjc8\" (UniqueName: \"kubernetes.io/projected/0b2b1ef4-961c-4803-856d-4d6deb42cc10-kube-api-access-prjc8\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.437355 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0b2b1ef4-961c-4803-856d-4d6deb42cc10-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.438325 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.438376 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a1330311-a981-4a87-bb2b-f948999d3ebc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1330311-a981-4a87-bb2b-f948999d3ebc\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1de73c5276b558ccdc9f3ff2c572cc75802899e6a1c2ec0b23d117c8ed05ecd7/globalmount\"" pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.438499 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0b2b1ef4-961c-4803-856d-4d6deb42cc10-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.442345 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0b2b1ef4-961c-4803-856d-4d6deb42cc10-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.443871 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0b2b1ef4-961c-4803-856d-4d6deb42cc10-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.445613 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0b2b1ef4-961c-4803-856d-4d6deb42cc10-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: 
\"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.459376 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prjc8\" (UniqueName: \"kubernetes.io/projected/0b2b1ef4-961c-4803-856d-4d6deb42cc10-kube-api-access-prjc8\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.480386 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a1330311-a981-4a87-bb2b-f948999d3ebc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1330311-a981-4a87-bb2b-f948999d3ebc\") pod \"rabbitmq-server-0\" (UID: \"0b2b1ef4-961c-4803-856d-4d6deb42cc10\") " pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.502312 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/rabbitmq-notifications-server-0"] Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.504144 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.507034 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-notifications-config-data" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.507039 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-notifications-erlang-cookie" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.507121 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-notifications-server-conf" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.508418 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-rabbitmq-notifications-svc" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.508889 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-notifications-plugins-conf" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.509176 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-notifications-default-user" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.513411 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-notifications-server-dockercfg-pvrnv" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.519766 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-notifications-server-0"] Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.566230 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.638063 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/23451e36-d03b-4039-ba05-d20e013b089b-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.638154 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/23451e36-d03b-4039-ba05-d20e013b089b-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.638178 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/23451e36-d03b-4039-ba05-d20e013b089b-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.638205 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5jzl\" (UniqueName: \"kubernetes.io/projected/23451e36-d03b-4039-ba05-d20e013b089b-kube-api-access-z5jzl\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.638282 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/23451e36-d03b-4039-ba05-d20e013b089b-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.638322 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7382014f-ce88-4222-b67c-cd4d68596739\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7382014f-ce88-4222-b67c-cd4d68596739\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.638350 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/23451e36-d03b-4039-ba05-d20e013b089b-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.638399 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/23451e36-d03b-4039-ba05-d20e013b089b-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.638421 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/23451e36-d03b-4039-ba05-d20e013b089b-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc 
kubenswrapper[4752]: I0929 11:02:31.638530 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23451e36-d03b-4039-ba05-d20e013b089b-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.638605 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/23451e36-d03b-4039-ba05-d20e013b089b-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.740078 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7382014f-ce88-4222-b67c-cd4d68596739\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7382014f-ce88-4222-b67c-cd4d68596739\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.740481 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/23451e36-d03b-4039-ba05-d20e013b089b-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.740525 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/23451e36-d03b-4039-ba05-d20e013b089b-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: 
\"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.740540 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/23451e36-d03b-4039-ba05-d20e013b089b-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.740580 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23451e36-d03b-4039-ba05-d20e013b089b-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.740603 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/23451e36-d03b-4039-ba05-d20e013b089b-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.740648 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/23451e36-d03b-4039-ba05-d20e013b089b-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.740682 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/23451e36-d03b-4039-ba05-d20e013b089b-rabbitmq-plugins\") pod 
\"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.740701 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/23451e36-d03b-4039-ba05-d20e013b089b-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.740718 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5jzl\" (UniqueName: \"kubernetes.io/projected/23451e36-d03b-4039-ba05-d20e013b089b-kube-api-access-z5jzl\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.740739 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/23451e36-d03b-4039-ba05-d20e013b089b-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.742622 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/23451e36-d03b-4039-ba05-d20e013b089b-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.742860 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/23451e36-d03b-4039-ba05-d20e013b089b-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.743333 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/23451e36-d03b-4039-ba05-d20e013b089b-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.743458 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/23451e36-d03b-4039-ba05-d20e013b089b-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.743505 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/23451e36-d03b-4039-ba05-d20e013b089b-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.745215 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/23451e36-d03b-4039-ba05-d20e013b089b-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.746588 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/23451e36-d03b-4039-ba05-d20e013b089b-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.746689 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/23451e36-d03b-4039-ba05-d20e013b089b-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.765659 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.765710 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7382014f-ce88-4222-b67c-cd4d68596739\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7382014f-ce88-4222-b67c-cd4d68596739\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8c71a766b12cf0efcb82715530642b3b277e73b6d2d3aa96492112692614b281/globalmount\"" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.768949 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5jzl\" (UniqueName: \"kubernetes.io/projected/23451e36-d03b-4039-ba05-d20e013b089b-kube-api-access-z5jzl\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.778348 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/23451e36-d03b-4039-ba05-d20e013b089b-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:31 crc kubenswrapper[4752]: I0929 11:02:31.813325 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7382014f-ce88-4222-b67c-cd4d68596739\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7382014f-ce88-4222-b67c-cd4d68596739\") pod \"rabbitmq-notifications-server-0\" (UID: \"23451e36-d03b-4039-ba05-d20e013b089b\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.089622 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-server-0"] Sep 29 11:02:32 crc kubenswrapper[4752]: W0929 11:02:32.098843 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b2b1ef4_961c_4803_856d_4d6deb42cc10.slice/crio-4f48a7a235d93d27f67448a71ab41bc30bdeed0373bde6576531322d6e1c997e WatchSource:0}: Error finding container 4f48a7a235d93d27f67448a71ab41bc30bdeed0373bde6576531322d6e1c997e: Status 404 returned error can't find the container with id 4f48a7a235d93d27f67448a71ab41bc30bdeed0373bde6576531322d6e1c997e Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.129460 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.560448 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-notifications-server-0"] Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.622349 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"23451e36-d03b-4039-ba05-d20e013b089b","Type":"ContainerStarted","Data":"ba260d5ff7d5fca49f90da1bf933e6ee4bc109717c6aed0f1e82c0b0a99cce0b"} Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.623770 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"0b2b1ef4-961c-4803-856d-4d6deb42cc10","Type":"ContainerStarted","Data":"4f48a7a235d93d27f67448a71ab41bc30bdeed0373bde6576531322d6e1c997e"} Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.780003 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/openstack-galera-0"] Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.782067 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.786861 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"galera-openstack-dockercfg-lb7tm" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.787428 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openstack-scripts" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.789390 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"osp-secret" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.790136 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-galera-openstack-svc" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.790185 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openstack-config-data" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.798493 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstack-galera-0"] Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.799983 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"combined-ca-bundle" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.864587 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-17a7d7ad-f311-4455-98e3-66a855d8e60e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17a7d7ad-f311-4455-98e3-66a855d8e60e\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.864649 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/58c49438-2c74-4c5e-b476-7fff98957387-config-data-default\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.864722 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/58c49438-2c74-4c5e-b476-7fff98957387-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.864752 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khwmm\" (UniqueName: \"kubernetes.io/projected/58c49438-2c74-4c5e-b476-7fff98957387-kube-api-access-khwmm\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.864789 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/58c49438-2c74-4c5e-b476-7fff98957387-secrets\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.864877 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58c49438-2c74-4c5e-b476-7fff98957387-kolla-config\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.864927 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/58c49438-2c74-4c5e-b476-7fff98957387-operator-scripts\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.864975 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c49438-2c74-4c5e-b476-7fff98957387-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.864999 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/58c49438-2c74-4c5e-b476-7fff98957387-config-data-generated\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.967739 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/58c49438-2c74-4c5e-b476-7fff98957387-config-data-generated\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.967842 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-17a7d7ad-f311-4455-98e3-66a855d8e60e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17a7d7ad-f311-4455-98e3-66a855d8e60e\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.967865 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/58c49438-2c74-4c5e-b476-7fff98957387-config-data-default\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.967915 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/58c49438-2c74-4c5e-b476-7fff98957387-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.967936 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khwmm\" (UniqueName: \"kubernetes.io/projected/58c49438-2c74-4c5e-b476-7fff98957387-kube-api-access-khwmm\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.967960 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/58c49438-2c74-4c5e-b476-7fff98957387-secrets\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.967977 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58c49438-2c74-4c5e-b476-7fff98957387-kolla-config\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.968012 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/58c49438-2c74-4c5e-b476-7fff98957387-operator-scripts\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.968049 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c49438-2c74-4c5e-b476-7fff98957387-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.968753 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58c49438-2c74-4c5e-b476-7fff98957387-kolla-config\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.968829 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/58c49438-2c74-4c5e-b476-7fff98957387-config-data-default\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.978321 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/58c49438-2c74-4c5e-b476-7fff98957387-config-data-generated\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.978740 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58c49438-2c74-4c5e-b476-7fff98957387-operator-scripts\") pod 
\"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.979256 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.979281 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-17a7d7ad-f311-4455-98e3-66a855d8e60e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17a7d7ad-f311-4455-98e3-66a855d8e60e\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4dc8aa1b6ab41dfb983d51253843a25fd1f75a7571eebb3f88d31a436e12b6af/globalmount\"" pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.979629 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c49438-2c74-4c5e-b476-7fff98957387-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.981171 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/58c49438-2c74-4c5e-b476-7fff98957387-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.990426 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/58c49438-2c74-4c5e-b476-7fff98957387-secrets\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " 
pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:32 crc kubenswrapper[4752]: I0929 11:02:32.994671 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khwmm\" (UniqueName: \"kubernetes.io/projected/58c49438-2c74-4c5e-b476-7fff98957387-kube-api-access-khwmm\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.017387 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/memcached-0"] Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.021378 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/memcached-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.039269 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-memcached-svc" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.040126 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"memcached-config-data" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.040179 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"memcached-memcached-dockercfg-cfn5s" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.062881 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/memcached-0"] Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.069149 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfbc51a1-9e9b-4af1-865c-d6228444dded-config-data\") pod \"memcached-0\" (UID: \"bfbc51a1-9e9b-4af1-865c-d6228444dded\") " pod="watcher-kuttl-default/memcached-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.069225 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-p2l57\" (UniqueName: \"kubernetes.io/projected/bfbc51a1-9e9b-4af1-865c-d6228444dded-kube-api-access-p2l57\") pod \"memcached-0\" (UID: \"bfbc51a1-9e9b-4af1-865c-d6228444dded\") " pod="watcher-kuttl-default/memcached-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.069247 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfbc51a1-9e9b-4af1-865c-d6228444dded-memcached-tls-certs\") pod \"memcached-0\" (UID: \"bfbc51a1-9e9b-4af1-865c-d6228444dded\") " pod="watcher-kuttl-default/memcached-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.069267 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bfbc51a1-9e9b-4af1-865c-d6228444dded-kolla-config\") pod \"memcached-0\" (UID: \"bfbc51a1-9e9b-4af1-865c-d6228444dded\") " pod="watcher-kuttl-default/memcached-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.069297 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfbc51a1-9e9b-4af1-865c-d6228444dded-combined-ca-bundle\") pod \"memcached-0\" (UID: \"bfbc51a1-9e9b-4af1-865c-d6228444dded\") " pod="watcher-kuttl-default/memcached-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.083449 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-17a7d7ad-f311-4455-98e3-66a855d8e60e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17a7d7ad-f311-4455-98e3-66a855d8e60e\") pod \"openstack-galera-0\" (UID: \"58c49438-2c74-4c5e-b476-7fff98957387\") " pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.101273 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.175787 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfbc51a1-9e9b-4af1-865c-d6228444dded-config-data\") pod \"memcached-0\" (UID: \"bfbc51a1-9e9b-4af1-865c-d6228444dded\") " pod="watcher-kuttl-default/memcached-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.176028 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2l57\" (UniqueName: \"kubernetes.io/projected/bfbc51a1-9e9b-4af1-865c-d6228444dded-kube-api-access-p2l57\") pod \"memcached-0\" (UID: \"bfbc51a1-9e9b-4af1-865c-d6228444dded\") " pod="watcher-kuttl-default/memcached-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.176079 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfbc51a1-9e9b-4af1-865c-d6228444dded-memcached-tls-certs\") pod \"memcached-0\" (UID: \"bfbc51a1-9e9b-4af1-865c-d6228444dded\") " pod="watcher-kuttl-default/memcached-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.176113 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bfbc51a1-9e9b-4af1-865c-d6228444dded-kolla-config\") pod \"memcached-0\" (UID: \"bfbc51a1-9e9b-4af1-865c-d6228444dded\") " pod="watcher-kuttl-default/memcached-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.176174 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfbc51a1-9e9b-4af1-865c-d6228444dded-combined-ca-bundle\") pod \"memcached-0\" (UID: \"bfbc51a1-9e9b-4af1-865c-d6228444dded\") " pod="watcher-kuttl-default/memcached-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.176685 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfbc51a1-9e9b-4af1-865c-d6228444dded-config-data\") pod \"memcached-0\" (UID: \"bfbc51a1-9e9b-4af1-865c-d6228444dded\") " pod="watcher-kuttl-default/memcached-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.177709 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bfbc51a1-9e9b-4af1-865c-d6228444dded-kolla-config\") pod \"memcached-0\" (UID: \"bfbc51a1-9e9b-4af1-865c-d6228444dded\") " pod="watcher-kuttl-default/memcached-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.182343 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfbc51a1-9e9b-4af1-865c-d6228444dded-combined-ca-bundle\") pod \"memcached-0\" (UID: \"bfbc51a1-9e9b-4af1-865c-d6228444dded\") " pod="watcher-kuttl-default/memcached-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.191361 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfbc51a1-9e9b-4af1-865c-d6228444dded-memcached-tls-certs\") pod \"memcached-0\" (UID: \"bfbc51a1-9e9b-4af1-865c-d6228444dded\") " pod="watcher-kuttl-default/memcached-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.202363 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2l57\" (UniqueName: \"kubernetes.io/projected/bfbc51a1-9e9b-4af1-865c-d6228444dded-kube-api-access-p2l57\") pod \"memcached-0\" (UID: \"bfbc51a1-9e9b-4af1-865c-d6228444dded\") " pod="watcher-kuttl-default/memcached-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.364912 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.365958 4752 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.371470 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.376622 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"telemetry-ceilometer-dockercfg-f76j6" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.413588 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/memcached-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.507851 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv5qf\" (UniqueName: \"kubernetes.io/projected/238407f7-3389-4791-becd-0852b1e66cea-kube-api-access-vv5qf\") pod \"kube-state-metrics-0\" (UID: \"238407f7-3389-4791-becd-0852b1e66cea\") " pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.609146 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv5qf\" (UniqueName: \"kubernetes.io/projected/238407f7-3389-4791-becd-0852b1e66cea-kube-api-access-vv5qf\") pod \"kube-state-metrics-0\" (UID: \"238407f7-3389-4791-becd-0852b1e66cea\") " pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.665219 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv5qf\" (UniqueName: \"kubernetes.io/projected/238407f7-3389-4791-becd-0852b1e66cea-kube-api-access-vv5qf\") pod \"kube-state-metrics-0\" (UID: \"238407f7-3389-4791-becd-0852b1e66cea\") " pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.683083 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["watcher-kuttl-default/openstack-galera-0"] Sep 29 11:02:33 crc kubenswrapper[4752]: I0929 11:02:33.699819 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.057499 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/alertmanager-metric-storage-0"] Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.059639 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.069561 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-web-config" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.072166 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-tls-assets-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.073662 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-generated" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.074018 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"metric-storage-alertmanager-dockercfg-xnct7" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.081559 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/alertmanager-metric-storage-0"] Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.115704 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a00720d9-6655-4234-a3ea-25de9303b7e4-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"a00720d9-6655-4234-a3ea-25de9303b7e4\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 
11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.115853 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a00720d9-6655-4234-a3ea-25de9303b7e4-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a00720d9-6655-4234-a3ea-25de9303b7e4\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.115890 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl6dd\" (UniqueName: \"kubernetes.io/projected/a00720d9-6655-4234-a3ea-25de9303b7e4-kube-api-access-jl6dd\") pod \"alertmanager-metric-storage-0\" (UID: \"a00720d9-6655-4234-a3ea-25de9303b7e4\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.115917 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a00720d9-6655-4234-a3ea-25de9303b7e4-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"a00720d9-6655-4234-a3ea-25de9303b7e4\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.115970 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a00720d9-6655-4234-a3ea-25de9303b7e4-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"a00720d9-6655-4234-a3ea-25de9303b7e4\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.115998 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a00720d9-6655-4234-a3ea-25de9303b7e4-config-out\") pod 
\"alertmanager-metric-storage-0\" (UID: \"a00720d9-6655-4234-a3ea-25de9303b7e4\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.209840 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/memcached-0"] Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.223687 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl6dd\" (UniqueName: \"kubernetes.io/projected/a00720d9-6655-4234-a3ea-25de9303b7e4-kube-api-access-jl6dd\") pod \"alertmanager-metric-storage-0\" (UID: \"a00720d9-6655-4234-a3ea-25de9303b7e4\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.223755 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a00720d9-6655-4234-a3ea-25de9303b7e4-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"a00720d9-6655-4234-a3ea-25de9303b7e4\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.223814 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a00720d9-6655-4234-a3ea-25de9303b7e4-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"a00720d9-6655-4234-a3ea-25de9303b7e4\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.223838 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a00720d9-6655-4234-a3ea-25de9303b7e4-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"a00720d9-6655-4234-a3ea-25de9303b7e4\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 
11:02:34.223910 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a00720d9-6655-4234-a3ea-25de9303b7e4-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"a00720d9-6655-4234-a3ea-25de9303b7e4\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.223971 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a00720d9-6655-4234-a3ea-25de9303b7e4-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a00720d9-6655-4234-a3ea-25de9303b7e4\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.224510 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a00720d9-6655-4234-a3ea-25de9303b7e4-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"a00720d9-6655-4234-a3ea-25de9303b7e4\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: W0929 11:02:34.229063 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfbc51a1_9e9b_4af1_865c_d6228444dded.slice/crio-6ddb9bc63dfae563b661b87acb1a4be7e0ae0b3f0c445a286c62523a0cdfc511 WatchSource:0}: Error finding container 6ddb9bc63dfae563b661b87acb1a4be7e0ae0b3f0c445a286c62523a0cdfc511: Status 404 returned error can't find the container with id 6ddb9bc63dfae563b661b87acb1a4be7e0ae0b3f0c445a286c62523a0cdfc511 Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.230635 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a00720d9-6655-4234-a3ea-25de9303b7e4-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: 
\"a00720d9-6655-4234-a3ea-25de9303b7e4\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.231570 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a00720d9-6655-4234-a3ea-25de9303b7e4-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"a00720d9-6655-4234-a3ea-25de9303b7e4\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.231573 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a00720d9-6655-4234-a3ea-25de9303b7e4-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"a00720d9-6655-4234-a3ea-25de9303b7e4\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.235780 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a00720d9-6655-4234-a3ea-25de9303b7e4-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a00720d9-6655-4234-a3ea-25de9303b7e4\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.251750 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl6dd\" (UniqueName: \"kubernetes.io/projected/a00720d9-6655-4234-a3ea-25de9303b7e4-kube-api-access-jl6dd\") pod \"alertmanager-metric-storage-0\" (UID: \"a00720d9-6655-4234-a3ea-25de9303b7e4\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.400488 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.440794 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.592007 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-6584dc9448-w8t78"] Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.595704 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-6584dc9448-w8t78" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.602975 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.603239 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-6kwm5" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.604298 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-6584dc9448-w8t78"] Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.713252 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"238407f7-3389-4791-becd-0852b1e66cea","Type":"ContainerStarted","Data":"93dffe600ec8f59a4c6baa802188934701b56823fb0c2c00a7d0152e4e51dd24"} Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.742084 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"bfbc51a1-9e9b-4af1-865c-d6228444dded","Type":"ContainerStarted","Data":"6ddb9bc63dfae563b661b87acb1a4be7e0ae0b3f0c445a286c62523a0cdfc511"} Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.743021 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-j4tpg\" (UniqueName: \"kubernetes.io/projected/a408d775-16c4-45e5-9750-e3b3f8141bd6-kube-api-access-j4tpg\") pod \"observability-ui-dashboards-6584dc9448-w8t78\" (UID: \"a408d775-16c4-45e5-9750-e3b3f8141bd6\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-w8t78" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.743095 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a408d775-16c4-45e5-9750-e3b3f8141bd6-serving-cert\") pod \"observability-ui-dashboards-6584dc9448-w8t78\" (UID: \"a408d775-16c4-45e5-9750-e3b3f8141bd6\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-w8t78" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.769101 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"58c49438-2c74-4c5e-b476-7fff98957387","Type":"ContainerStarted","Data":"d7de82f14bed68b0c420a2292f7c65625a13a7f64974243f5b75536a22917be3"} Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.803861 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.805921 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.811484 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.817325 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"prometheus-metric-storage-rulefiles-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.817634 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.817781 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-web-config" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.818029 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"metric-storage-prometheus-dockercfg-bdf9r" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.831423 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-tls-assets-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.846173 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4tpg\" (UniqueName: \"kubernetes.io/projected/a408d775-16c4-45e5-9750-e3b3f8141bd6-kube-api-access-j4tpg\") pod \"observability-ui-dashboards-6584dc9448-w8t78\" (UID: \"a408d775-16c4-45e5-9750-e3b3f8141bd6\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-w8t78" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.846265 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a408d775-16c4-45e5-9750-e3b3f8141bd6-serving-cert\") pod 
\"observability-ui-dashboards-6584dc9448-w8t78\" (UID: \"a408d775-16c4-45e5-9750-e3b3f8141bd6\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-w8t78" Sep 29 11:02:34 crc kubenswrapper[4752]: E0929 11:02:34.846570 4752 secret.go:188] Couldn't get secret openshift-operators/observability-ui-dashboards: secret "observability-ui-dashboards" not found Sep 29 11:02:34 crc kubenswrapper[4752]: E0929 11:02:34.846631 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a408d775-16c4-45e5-9750-e3b3f8141bd6-serving-cert podName:a408d775-16c4-45e5-9750-e3b3f8141bd6 nodeName:}" failed. No retries permitted until 2025-09-29 11:02:35.346609443 +0000 UTC m=+1096.135751110 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a408d775-16c4-45e5-9750-e3b3f8141bd6-serving-cert") pod "observability-ui-dashboards-6584dc9448-w8t78" (UID: "a408d775-16c4-45e5-9750-e3b3f8141bd6") : secret "observability-ui-dashboards" not found Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.860619 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.940958 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4tpg\" (UniqueName: \"kubernetes.io/projected/a408d775-16c4-45e5-9750-e3b3f8141bd6-kube-api-access-j4tpg\") pod \"observability-ui-dashboards-6584dc9448-w8t78\" (UID: \"a408d775-16c4-45e5-9750-e3b3f8141bd6\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-w8t78" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.953612 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " 
pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.953675 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.953724 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.953744 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.953767 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-config\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.953787 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.953818 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l86vs\" (UniqueName: \"kubernetes.io/projected/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-kube-api-access-l86vs\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:34 crc kubenswrapper[4752]: I0929 11:02:34.953845 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.058670 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.058783 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.058854 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.058900 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.058927 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.058954 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-config\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.058982 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:35 crc 
kubenswrapper[4752]: I0929 11:02:35.059008 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l86vs\" (UniqueName: \"kubernetes.io/projected/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-kube-api-access-l86vs\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.082218 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.084272 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.084503 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.092554 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " 
pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.097961 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.100635 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.103897 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9b2293af6eff455224fb1fa348a6b5097aab5a7c0afb95e3cd736ba39d87a712/globalmount\"" pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.118598 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-config\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.123367 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l86vs\" (UniqueName: \"kubernetes.io/projected/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-kube-api-access-l86vs\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " 
pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.199042 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-55f6b65f7f-d24mw"] Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.200103 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.240125 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55f6b65f7f-d24mw"] Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.263991 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-oauth-serving-cert\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.264039 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqphs\" (UniqueName: \"kubernetes.io/projected/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-kube-api-access-vqphs\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.264098 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-trusted-ca-bundle\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.264163 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-service-ca\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.264209 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-console-oauth-config\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.264236 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-console-serving-cert\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.264286 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-console-config\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.290587 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\") pod \"prometheus-metric-storage-0\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.367408 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-console-oauth-config\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.367479 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-console-serving-cert\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.367532 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a408d775-16c4-45e5-9750-e3b3f8141bd6-serving-cert\") pod \"observability-ui-dashboards-6584dc9448-w8t78\" (UID: \"a408d775-16c4-45e5-9750-e3b3f8141bd6\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-w8t78" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.367557 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-console-config\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.367588 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-oauth-serving-cert\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 
11:02:35.367608 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqphs\" (UniqueName: \"kubernetes.io/projected/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-kube-api-access-vqphs\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.367642 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-trusted-ca-bundle\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.367685 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-service-ca\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.369513 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-console-config\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.370994 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-trusted-ca-bundle\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.371042 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-oauth-serving-cert\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.371762 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-service-ca\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.377990 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a408d775-16c4-45e5-9750-e3b3f8141bd6-serving-cert\") pod \"observability-ui-dashboards-6584dc9448-w8t78\" (UID: \"a408d775-16c4-45e5-9750-e3b3f8141bd6\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-w8t78" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.378770 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-console-serving-cert\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.379299 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-console-oauth-config\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.394006 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vqphs\" (UniqueName: \"kubernetes.io/projected/8384ab5b-5da0-4f20-9786-b9b78d89a1e4-kube-api-access-vqphs\") pod \"console-55f6b65f7f-d24mw\" (UID: \"8384ab5b-5da0-4f20-9786-b9b78d89a1e4\") " pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.427852 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/alertmanager-metric-storage-0"] Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.457611 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.575176 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.643955 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-6584dc9448-w8t78" Sep 29 11:02:35 crc kubenswrapper[4752]: I0929 11:02:35.794363 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"a00720d9-6655-4234-a3ea-25de9303b7e4","Type":"ContainerStarted","Data":"0940bf231ed03bc40253975d338779d617b80580c8918f0e897cb20385e4ff6f"} Sep 29 11:02:36 crc kubenswrapper[4752]: I0929 11:02:36.544158 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Sep 29 11:02:37 crc kubenswrapper[4752]: I0929 11:02:37.361384 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55f6b65f7f-d24mw"] Sep 29 11:02:37 crc kubenswrapper[4752]: I0929 11:02:37.420764 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-6584dc9448-w8t78"] Sep 29 11:02:37 crc kubenswrapper[4752]: W0929 11:02:37.576107 4752 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8384ab5b_5da0_4f20_9786_b9b78d89a1e4.slice/crio-387e1a243e258b3898570e5b3bf5cedf27c021178430dbe7f411a43742aa7251 WatchSource:0}: Error finding container 387e1a243e258b3898570e5b3bf5cedf27c021178430dbe7f411a43742aa7251: Status 404 returned error can't find the container with id 387e1a243e258b3898570e5b3bf5cedf27c021178430dbe7f411a43742aa7251 Sep 29 11:02:37 crc kubenswrapper[4752]: W0929 11:02:37.577323 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda408d775_16c4_45e5_9750_e3b3f8141bd6.slice/crio-313d2d1e588a9bc6e1690c7cb0484226491119bcad7e620c42db34c1d22d68a2 WatchSource:0}: Error finding container 313d2d1e588a9bc6e1690c7cb0484226491119bcad7e620c42db34c1d22d68a2: Status 404 returned error can't find the container with id 313d2d1e588a9bc6e1690c7cb0484226491119bcad7e620c42db34c1d22d68a2 Sep 29 11:02:37 crc kubenswrapper[4752]: I0929 11:02:37.828073 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55f6b65f7f-d24mw" event={"ID":"8384ab5b-5da0-4f20-9786-b9b78d89a1e4","Type":"ContainerStarted","Data":"387e1a243e258b3898570e5b3bf5cedf27c021178430dbe7f411a43742aa7251"} Sep 29 11:02:37 crc kubenswrapper[4752]: I0929 11:02:37.829596 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-6584dc9448-w8t78" event={"ID":"a408d775-16c4-45e5-9750-e3b3f8141bd6","Type":"ContainerStarted","Data":"313d2d1e588a9bc6e1690c7cb0484226491119bcad7e620c42db34c1d22d68a2"} Sep 29 11:02:37 crc kubenswrapper[4752]: I0929 11:02:37.831487 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"4c27e500-7063-40cc-9a6d-4e8fa0df4a98","Type":"ContainerStarted","Data":"5957773d2291ea0407409a09509a21976b2c685a8186970434ea1744e4aaddc6"} Sep 29 11:02:50 crc kubenswrapper[4752]: 
E0929 11:02:50.567173 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Sep 29 11:02:50 crc kubenswrapper[4752]: E0929 11:02:50.568110 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n66fhf5h68dh8fh68bh5cbhc9hfdh688h67fh5b5h5f7hcch666h5cfh66bh665h559h76h89h5f9h58fh56fh568h64ch5d4h5c4h56ch5ddh5d4h8dh94q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,Rea
dOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p2l57,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_watcher-kuttl-default(bfbc51a1-9e9b-4af1-865c-d6228444dded): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 11:02:50 crc kubenswrapper[4752]: E0929 11:02:50.569302 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/memcached-0" podUID="bfbc51a1-9e9b-4af1-865c-d6228444dded" Sep 29 
11:02:50 crc kubenswrapper[4752]: I0929 11:02:50.948473 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55f6b65f7f-d24mw" event={"ID":"8384ab5b-5da0-4f20-9786-b9b78d89a1e4","Type":"ContainerStarted","Data":"19d8acf5e78432cae53141fc97773a665701aee6fda3dd9286be8bb8e42ac08c"} Sep 29 11:02:50 crc kubenswrapper[4752]: I0929 11:02:50.950840 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-6584dc9448-w8t78" event={"ID":"a408d775-16c4-45e5-9750-e3b3f8141bd6","Type":"ContainerStarted","Data":"2c141605a556d71ae8af603b6417c95316c9e05ef664384783a576e9b4be3ed2"} Sep 29 11:02:50 crc kubenswrapper[4752]: I0929 11:02:50.952786 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"238407f7-3389-4791-becd-0852b1e66cea","Type":"ContainerStarted","Data":"1d0f583aba0a2fd55be194208a8969712a520a0590bc133440bf316da56eb8cb"} Sep 29 11:02:50 crc kubenswrapper[4752]: I0929 11:02:50.952914 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:02:50 crc kubenswrapper[4752]: E0929 11:02:50.954592 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="watcher-kuttl-default/memcached-0" podUID="bfbc51a1-9e9b-4af1-865c-d6228444dded" Sep 29 11:02:50 crc kubenswrapper[4752]: I0929 11:02:50.978691 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55f6b65f7f-d24mw" podStartSLOduration=15.978647739 podStartE2EDuration="15.978647739s" podCreationTimestamp="2025-09-29 11:02:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 
11:02:50.965690326 +0000 UTC m=+1111.754832003" watchObservedRunningTime="2025-09-29 11:02:50.978647739 +0000 UTC m=+1111.767789426" Sep 29 11:02:51 crc kubenswrapper[4752]: I0929 11:02:51.002911 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/kube-state-metrics-0" podStartSLOduration=1.9925466950000001 podStartE2EDuration="18.002888621s" podCreationTimestamp="2025-09-29 11:02:33 +0000 UTC" firstStartedPulling="2025-09-29 11:02:34.602524342 +0000 UTC m=+1095.391666009" lastFinishedPulling="2025-09-29 11:02:50.612866258 +0000 UTC m=+1111.402007935" observedRunningTime="2025-09-29 11:02:50.981474972 +0000 UTC m=+1111.770616639" watchObservedRunningTime="2025-09-29 11:02:51.002888621 +0000 UTC m=+1111.792030288" Sep 29 11:02:51 crc kubenswrapper[4752]: I0929 11:02:51.026503 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-6584dc9448-w8t78" podStartSLOduration=3.969523938 podStartE2EDuration="17.026487157s" podCreationTimestamp="2025-09-29 11:02:34 +0000 UTC" firstStartedPulling="2025-09-29 11:02:37.584257628 +0000 UTC m=+1098.373399295" lastFinishedPulling="2025-09-29 11:02:50.641220847 +0000 UTC m=+1111.430362514" observedRunningTime="2025-09-29 11:02:51.025750098 +0000 UTC m=+1111.814891765" watchObservedRunningTime="2025-09-29 11:02:51.026487157 +0000 UTC m=+1111.815628824" Sep 29 11:02:51 crc kubenswrapper[4752]: I0929 11:02:51.964003 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"0b2b1ef4-961c-4803-856d-4d6deb42cc10","Type":"ContainerStarted","Data":"720bb814a5bd0e7f20cf0ca8b7c41eae841a10b316796699d5cd44a22bf80160"} Sep 29 11:02:51 crc kubenswrapper[4752]: I0929 11:02:51.965770 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" 
event={"ID":"23451e36-d03b-4039-ba05-d20e013b089b","Type":"ContainerStarted","Data":"34f419b9a68d0a4a859f95d55a246318e52e7f99c5029308854ec0ab6ac6d26d"} Sep 29 11:02:51 crc kubenswrapper[4752]: I0929 11:02:51.967137 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"58c49438-2c74-4c5e-b476-7fff98957387","Type":"ContainerStarted","Data":"d389d9223d1b0d97aaa24ebbe0448061cf2a30405f6650073fdd09e400078b8e"} Sep 29 11:02:53 crc kubenswrapper[4752]: I0929 11:02:53.992752 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"a00720d9-6655-4234-a3ea-25de9303b7e4","Type":"ContainerStarted","Data":"00b41f2ae90b2851b9c5b57f6ffa7aa06267a81edbc033bf4f561f2e15e44a1d"} Sep 29 11:02:53 crc kubenswrapper[4752]: I0929 11:02:53.997272 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"4c27e500-7063-40cc-9a6d-4e8fa0df4a98","Type":"ContainerStarted","Data":"3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f"} Sep 29 11:02:55 crc kubenswrapper[4752]: I0929 11:02:55.006072 4752 generic.go:334] "Generic (PLEG): container finished" podID="58c49438-2c74-4c5e-b476-7fff98957387" containerID="d389d9223d1b0d97aaa24ebbe0448061cf2a30405f6650073fdd09e400078b8e" exitCode=0 Sep 29 11:02:55 crc kubenswrapper[4752]: I0929 11:02:55.006149 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"58c49438-2c74-4c5e-b476-7fff98957387","Type":"ContainerDied","Data":"d389d9223d1b0d97aaa24ebbe0448061cf2a30405f6650073fdd09e400078b8e"} Sep 29 11:02:55 crc kubenswrapper[4752]: I0929 11:02:55.576992 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:55 crc kubenswrapper[4752]: I0929 11:02:55.577447 4752 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:55 crc kubenswrapper[4752]: I0929 11:02:55.583352 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:56 crc kubenswrapper[4752]: I0929 11:02:56.016009 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"58c49438-2c74-4c5e-b476-7fff98957387","Type":"ContainerStarted","Data":"6622fdcc99c120287e0cef92e34fd8a4ffb09846d3ea4e77f31329fefc2c4482"} Sep 29 11:02:56 crc kubenswrapper[4752]: I0929 11:02:56.020613 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55f6b65f7f-d24mw" Sep 29 11:02:56 crc kubenswrapper[4752]: I0929 11:02:56.038154 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/openstack-galera-0" podStartSLOduration=8.129858804 podStartE2EDuration="25.038130397s" podCreationTimestamp="2025-09-29 11:02:31 +0000 UTC" firstStartedPulling="2025-09-29 11:02:33.782696871 +0000 UTC m=+1094.571838538" lastFinishedPulling="2025-09-29 11:02:50.690968464 +0000 UTC m=+1111.480110131" observedRunningTime="2025-09-29 11:02:56.035246753 +0000 UTC m=+1116.824388430" watchObservedRunningTime="2025-09-29 11:02:56.038130397 +0000 UTC m=+1116.827272064" Sep 29 11:02:56 crc kubenswrapper[4752]: I0929 11:02:56.114160 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85546cf746-dfhr8"] Sep 29 11:02:56 crc kubenswrapper[4752]: I0929 11:02:56.175601 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 11:02:56 crc kubenswrapper[4752]: I0929 11:02:56.175662 4752 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:02:56 crc kubenswrapper[4752]: I0929 11:02:56.175938 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 11:02:56 crc kubenswrapper[4752]: I0929 11:02:56.176612 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"48a0da04429cf7fcc316318f0d1c0bddde646fbce423db761e54fa0241cf9fda"} pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 11:02:56 crc kubenswrapper[4752]: I0929 11:02:56.176675 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" containerID="cri-o://48a0da04429cf7fcc316318f0d1c0bddde646fbce423db761e54fa0241cf9fda" gracePeriod=600 Sep 29 11:02:57 crc kubenswrapper[4752]: I0929 11:02:57.026605 4752 generic.go:334] "Generic (PLEG): container finished" podID="5863c243-797d-462a-b11f-71aaf005f8d1" containerID="48a0da04429cf7fcc316318f0d1c0bddde646fbce423db761e54fa0241cf9fda" exitCode=0 Sep 29 11:02:57 crc kubenswrapper[4752]: I0929 11:02:57.026665 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerDied","Data":"48a0da04429cf7fcc316318f0d1c0bddde646fbce423db761e54fa0241cf9fda"} Sep 29 11:02:57 crc kubenswrapper[4752]: I0929 11:02:57.027372 
4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerStarted","Data":"163ff8dbb1a373e991e8699e30ebf0d1354dad4f96196cd59c49a7d6edcb147e"} Sep 29 11:02:57 crc kubenswrapper[4752]: I0929 11:02:57.027404 4752 scope.go:117] "RemoveContainer" containerID="dbba8a90f680e465e868c9761ab597851b2db8c336cda0417acd2b4d326ea54a" Sep 29 11:03:02 crc kubenswrapper[4752]: I0929 11:03:02.079316 4752 generic.go:334] "Generic (PLEG): container finished" podID="a00720d9-6655-4234-a3ea-25de9303b7e4" containerID="00b41f2ae90b2851b9c5b57f6ffa7aa06267a81edbc033bf4f561f2e15e44a1d" exitCode=0 Sep 29 11:03:02 crc kubenswrapper[4752]: I0929 11:03:02.079447 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"a00720d9-6655-4234-a3ea-25de9303b7e4","Type":"ContainerDied","Data":"00b41f2ae90b2851b9c5b57f6ffa7aa06267a81edbc033bf4f561f2e15e44a1d"} Sep 29 11:03:03 crc kubenswrapper[4752]: I0929 11:03:03.092202 4752 generic.go:334] "Generic (PLEG): container finished" podID="4c27e500-7063-40cc-9a6d-4e8fa0df4a98" containerID="3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f" exitCode=0 Sep 29 11:03:03 crc kubenswrapper[4752]: I0929 11:03:03.092313 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"4c27e500-7063-40cc-9a6d-4e8fa0df4a98","Type":"ContainerDied","Data":"3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f"} Sep 29 11:03:03 crc kubenswrapper[4752]: I0929 11:03:03.101638 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:03:03 crc kubenswrapper[4752]: I0929 11:03:03.101688 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:03:03 crc 
kubenswrapper[4752]: I0929 11:03:03.159072 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:03:03 crc kubenswrapper[4752]: I0929 11:03:03.708795 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:03:04 crc kubenswrapper[4752]: I0929 11:03:04.149585 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/openstack-galera-0" Sep 29 11:03:05 crc kubenswrapper[4752]: I0929 11:03:05.108876 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"bfbc51a1-9e9b-4af1-865c-d6228444dded","Type":"ContainerStarted","Data":"ffc299b88bc217067d793d7ed6f8fa3522be304433be32bde7d189bb1b72660d"} Sep 29 11:03:05 crc kubenswrapper[4752]: I0929 11:03:05.110013 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/memcached-0" Sep 29 11:03:05 crc kubenswrapper[4752]: I0929 11:03:05.133779 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/memcached-0" podStartSLOduration=2.943473197 podStartE2EDuration="33.13375395s" podCreationTimestamp="2025-09-29 11:02:32 +0000 UTC" firstStartedPulling="2025-09-29 11:02:34.251977776 +0000 UTC m=+1095.041119443" lastFinishedPulling="2025-09-29 11:03:04.442258529 +0000 UTC m=+1125.231400196" observedRunningTime="2025-09-29 11:03:05.131503703 +0000 UTC m=+1125.920645380" watchObservedRunningTime="2025-09-29 11:03:05.13375395 +0000 UTC m=+1125.922895617" Sep 29 11:03:06 crc kubenswrapper[4752]: I0929 11:03:06.125466 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"a00720d9-6655-4234-a3ea-25de9303b7e4","Type":"ContainerStarted","Data":"3e965bd08f8b03ae705cb2196cd0a783185c2863c5836844c5830231794140d0"} Sep 29 11:03:08 crc 
kubenswrapper[4752]: I0929 11:03:08.145592 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"a00720d9-6655-4234-a3ea-25de9303b7e4","Type":"ContainerStarted","Data":"e84198db1e65fd249887db10ecb7efab1249bcfe87ecfec71538057644dec1e4"} Sep 29 11:03:08 crc kubenswrapper[4752]: I0929 11:03:08.147146 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:03:08 crc kubenswrapper[4752]: I0929 11:03:08.151704 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Sep 29 11:03:08 crc kubenswrapper[4752]: I0929 11:03:08.174367 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/alertmanager-metric-storage-0" podStartSLOduration=4.494289875 podStartE2EDuration="34.17434492s" podCreationTimestamp="2025-09-29 11:02:34 +0000 UTC" firstStartedPulling="2025-09-29 11:02:35.632026794 +0000 UTC m=+1096.421168461" lastFinishedPulling="2025-09-29 11:03:05.312081839 +0000 UTC m=+1126.101223506" observedRunningTime="2025-09-29 11:03:08.169566808 +0000 UTC m=+1128.958708475" watchObservedRunningTime="2025-09-29 11:03:08.17434492 +0000 UTC m=+1128.963486587" Sep 29 11:03:13 crc kubenswrapper[4752]: I0929 11:03:13.046689 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-db-create-8mp5x"] Sep 29 11:03:13 crc kubenswrapper[4752]: I0929 11:03:13.050362 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-8mp5x" Sep 29 11:03:13 crc kubenswrapper[4752]: I0929 11:03:13.093612 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-create-8mp5x"] Sep 29 11:03:13 crc kubenswrapper[4752]: I0929 11:03:13.101053 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjwmq\" (UniqueName: \"kubernetes.io/projected/e25c334c-c7a9-46d2-b94b-9956db4aac80-kube-api-access-mjwmq\") pod \"keystone-db-create-8mp5x\" (UID: \"e25c334c-c7a9-46d2-b94b-9956db4aac80\") " pod="watcher-kuttl-default/keystone-db-create-8mp5x" Sep 29 11:03:13 crc kubenswrapper[4752]: I0929 11:03:13.192150 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"4c27e500-7063-40cc-9a6d-4e8fa0df4a98","Type":"ContainerStarted","Data":"f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95"} Sep 29 11:03:13 crc kubenswrapper[4752]: I0929 11:03:13.203673 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjwmq\" (UniqueName: \"kubernetes.io/projected/e25c334c-c7a9-46d2-b94b-9956db4aac80-kube-api-access-mjwmq\") pod \"keystone-db-create-8mp5x\" (UID: \"e25c334c-c7a9-46d2-b94b-9956db4aac80\") " pod="watcher-kuttl-default/keystone-db-create-8mp5x" Sep 29 11:03:13 crc kubenswrapper[4752]: I0929 11:03:13.222983 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjwmq\" (UniqueName: \"kubernetes.io/projected/e25c334c-c7a9-46d2-b94b-9956db4aac80-kube-api-access-mjwmq\") pod \"keystone-db-create-8mp5x\" (UID: \"e25c334c-c7a9-46d2-b94b-9956db4aac80\") " pod="watcher-kuttl-default/keystone-db-create-8mp5x" Sep 29 11:03:13 crc kubenswrapper[4752]: I0929 11:03:13.398850 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-8mp5x" Sep 29 11:03:13 crc kubenswrapper[4752]: I0929 11:03:13.419134 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/memcached-0" Sep 29 11:03:13 crc kubenswrapper[4752]: I0929 11:03:13.851227 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-create-8mp5x"] Sep 29 11:03:14 crc kubenswrapper[4752]: I0929 11:03:14.214719 4752 generic.go:334] "Generic (PLEG): container finished" podID="e25c334c-c7a9-46d2-b94b-9956db4aac80" containerID="6e951f81f5c5a9b7c5ffc69ed43b0a83d60384b01f23a028a73575b9edb36eea" exitCode=0 Sep 29 11:03:14 crc kubenswrapper[4752]: I0929 11:03:14.214786 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-create-8mp5x" event={"ID":"e25c334c-c7a9-46d2-b94b-9956db4aac80","Type":"ContainerDied","Data":"6e951f81f5c5a9b7c5ffc69ed43b0a83d60384b01f23a028a73575b9edb36eea"} Sep 29 11:03:14 crc kubenswrapper[4752]: I0929 11:03:14.214868 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-create-8mp5x" event={"ID":"e25c334c-c7a9-46d2-b94b-9956db4aac80","Type":"ContainerStarted","Data":"586b960e7452be90ed16f10f7088138fe25346bfba0b7377dfd43f2d538c2011"} Sep 29 11:03:15 crc kubenswrapper[4752]: I0929 11:03:15.615350 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-8mp5x" Sep 29 11:03:15 crc kubenswrapper[4752]: I0929 11:03:15.748936 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjwmq\" (UniqueName: \"kubernetes.io/projected/e25c334c-c7a9-46d2-b94b-9956db4aac80-kube-api-access-mjwmq\") pod \"e25c334c-c7a9-46d2-b94b-9956db4aac80\" (UID: \"e25c334c-c7a9-46d2-b94b-9956db4aac80\") " Sep 29 11:03:15 crc kubenswrapper[4752]: I0929 11:03:15.757732 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e25c334c-c7a9-46d2-b94b-9956db4aac80-kube-api-access-mjwmq" (OuterVolumeSpecName: "kube-api-access-mjwmq") pod "e25c334c-c7a9-46d2-b94b-9956db4aac80" (UID: "e25c334c-c7a9-46d2-b94b-9956db4aac80"). InnerVolumeSpecName "kube-api-access-mjwmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:03:15 crc kubenswrapper[4752]: I0929 11:03:15.851209 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjwmq\" (UniqueName: \"kubernetes.io/projected/e25c334c-c7a9-46d2-b94b-9956db4aac80-kube-api-access-mjwmq\") on node \"crc\" DevicePath \"\"" Sep 29 11:03:16 crc kubenswrapper[4752]: I0929 11:03:16.245771 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-create-8mp5x" event={"ID":"e25c334c-c7a9-46d2-b94b-9956db4aac80","Type":"ContainerDied","Data":"586b960e7452be90ed16f10f7088138fe25346bfba0b7377dfd43f2d538c2011"} Sep 29 11:03:16 crc kubenswrapper[4752]: I0929 11:03:16.246024 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="586b960e7452be90ed16f10f7088138fe25346bfba0b7377dfd43f2d538c2011" Sep 29 11:03:16 crc kubenswrapper[4752]: I0929 11:03:16.246138 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-8mp5x" Sep 29 11:03:16 crc kubenswrapper[4752]: I0929 11:03:16.250987 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"4c27e500-7063-40cc-9a6d-4e8fa0df4a98","Type":"ContainerStarted","Data":"e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647"} Sep 29 11:03:20 crc kubenswrapper[4752]: I0929 11:03:20.294701 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"4c27e500-7063-40cc-9a6d-4e8fa0df4a98","Type":"ContainerStarted","Data":"5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605"} Sep 29 11:03:20 crc kubenswrapper[4752]: I0929 11:03:20.319898 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/prometheus-metric-storage-0" podStartSLOduration=4.786926158 podStartE2EDuration="47.319872313s" podCreationTimestamp="2025-09-29 11:02:33 +0000 UTC" firstStartedPulling="2025-09-29 11:02:36.927392432 +0000 UTC m=+1097.716534099" lastFinishedPulling="2025-09-29 11:03:19.460338587 +0000 UTC m=+1140.249480254" observedRunningTime="2025-09-29 11:03:20.318625061 +0000 UTC m=+1141.107766728" watchObservedRunningTime="2025-09-29 11:03:20.319872313 +0000 UTC m=+1141.109013980" Sep 29 11:03:20 crc kubenswrapper[4752]: I0929 11:03:20.458150 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:20 crc kubenswrapper[4752]: I0929 11:03:20.458220 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:20 crc kubenswrapper[4752]: I0929 11:03:20.460633 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 
11:03:21.162049 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-85546cf746-dfhr8" podUID="4f94737b-80ce-4e92-85df-18fa2b1cbe8e" containerName="console" containerID="cri-o://83b24978d536cb9d3776d3364ef7f94a3f28f84f3fd0ee80edc7ba5d4946f101" gracePeriod=15 Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.306672 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85546cf746-dfhr8_4f94737b-80ce-4e92-85df-18fa2b1cbe8e/console/0.log" Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.306721 4752 generic.go:334] "Generic (PLEG): container finished" podID="4f94737b-80ce-4e92-85df-18fa2b1cbe8e" containerID="83b24978d536cb9d3776d3364ef7f94a3f28f84f3fd0ee80edc7ba5d4946f101" exitCode=2 Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.306881 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85546cf746-dfhr8" event={"ID":"4f94737b-80ce-4e92-85df-18fa2b1cbe8e","Type":"ContainerDied","Data":"83b24978d536cb9d3776d3364ef7f94a3f28f84f3fd0ee80edc7ba5d4946f101"} Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.308619 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.644195 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85546cf746-dfhr8_4f94737b-80ce-4e92-85df-18fa2b1cbe8e/console/0.log" Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.644655 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85546cf746-dfhr8" Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.764495 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-console-oauth-config\") pod \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.764642 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-console-config\") pod \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.765605 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-console-config" (OuterVolumeSpecName: "console-config") pod "4f94737b-80ce-4e92-85df-18fa2b1cbe8e" (UID: "4f94737b-80ce-4e92-85df-18fa2b1cbe8e"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.765710 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-service-ca\") pod \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.765735 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-console-serving-cert\") pod \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.766117 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-service-ca" (OuterVolumeSpecName: "service-ca") pod "4f94737b-80ce-4e92-85df-18fa2b1cbe8e" (UID: "4f94737b-80ce-4e92-85df-18fa2b1cbe8e"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.766180 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rjsw\" (UniqueName: \"kubernetes.io/projected/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-kube-api-access-6rjsw\") pod \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.766209 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-trusted-ca-bundle\") pod \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.766264 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-oauth-serving-cert\") pod \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\" (UID: \"4f94737b-80ce-4e92-85df-18fa2b1cbe8e\") " Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.766558 4752 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.766577 4752 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-console-config\") on node \"crc\" DevicePath \"\"" Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.767257 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4f94737b-80ce-4e92-85df-18fa2b1cbe8e" (UID: 
"4f94737b-80ce-4e92-85df-18fa2b1cbe8e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.767394 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4f94737b-80ce-4e92-85df-18fa2b1cbe8e" (UID: "4f94737b-80ce-4e92-85df-18fa2b1cbe8e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.771999 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4f94737b-80ce-4e92-85df-18fa2b1cbe8e" (UID: "4f94737b-80ce-4e92-85df-18fa2b1cbe8e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.772104 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-kube-api-access-6rjsw" (OuterVolumeSpecName: "kube-api-access-6rjsw") pod "4f94737b-80ce-4e92-85df-18fa2b1cbe8e" (UID: "4f94737b-80ce-4e92-85df-18fa2b1cbe8e"). InnerVolumeSpecName "kube-api-access-6rjsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.773249 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4f94737b-80ce-4e92-85df-18fa2b1cbe8e" (UID: "4f94737b-80ce-4e92-85df-18fa2b1cbe8e"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.867925 4752 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.867968 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rjsw\" (UniqueName: \"kubernetes.io/projected/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-kube-api-access-6rjsw\") on node \"crc\" DevicePath \"\"" Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.867984 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.867998 4752 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 11:03:21 crc kubenswrapper[4752]: I0929 11:03:21.868012 4752 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f94737b-80ce-4e92-85df-18fa2b1cbe8e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 29 11:03:22 crc kubenswrapper[4752]: I0929 11:03:22.315931 4752 generic.go:334] "Generic (PLEG): container finished" podID="0b2b1ef4-961c-4803-856d-4d6deb42cc10" containerID="720bb814a5bd0e7f20cf0ca8b7c41eae841a10b316796699d5cd44a22bf80160" exitCode=0 Sep 29 11:03:22 crc kubenswrapper[4752]: I0929 11:03:22.315998 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" 
event={"ID":"0b2b1ef4-961c-4803-856d-4d6deb42cc10","Type":"ContainerDied","Data":"720bb814a5bd0e7f20cf0ca8b7c41eae841a10b316796699d5cd44a22bf80160"} Sep 29 11:03:22 crc kubenswrapper[4752]: I0929 11:03:22.320458 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85546cf746-dfhr8_4f94737b-80ce-4e92-85df-18fa2b1cbe8e/console/0.log" Sep 29 11:03:22 crc kubenswrapper[4752]: I0929 11:03:22.320564 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85546cf746-dfhr8" Sep 29 11:03:22 crc kubenswrapper[4752]: I0929 11:03:22.321107 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85546cf746-dfhr8" event={"ID":"4f94737b-80ce-4e92-85df-18fa2b1cbe8e","Type":"ContainerDied","Data":"c9445b3e2f7a928bf6777b3822af936db527b5afef5bb86e0775789d605f8c6e"} Sep 29 11:03:22 crc kubenswrapper[4752]: I0929 11:03:22.321141 4752 scope.go:117] "RemoveContainer" containerID="83b24978d536cb9d3776d3364ef7f94a3f28f84f3fd0ee80edc7ba5d4946f101" Sep 29 11:03:22 crc kubenswrapper[4752]: I0929 11:03:22.322956 4752 generic.go:334] "Generic (PLEG): container finished" podID="23451e36-d03b-4039-ba05-d20e013b089b" containerID="34f419b9a68d0a4a859f95d55a246318e52e7f99c5029308854ec0ab6ac6d26d" exitCode=0 Sep 29 11:03:22 crc kubenswrapper[4752]: I0929 11:03:22.323889 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"23451e36-d03b-4039-ba05-d20e013b089b","Type":"ContainerDied","Data":"34f419b9a68d0a4a859f95d55a246318e52e7f99c5029308854ec0ab6ac6d26d"} Sep 29 11:03:22 crc kubenswrapper[4752]: I0929 11:03:22.498111 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85546cf746-dfhr8"] Sep 29 11:03:22 crc kubenswrapper[4752]: I0929 11:03:22.510706 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-85546cf746-dfhr8"] Sep 29 11:03:23 crc 
kubenswrapper[4752]: I0929 11:03:23.101237 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-238e-account-create-vpltk"] Sep 29 11:03:23 crc kubenswrapper[4752]: E0929 11:03:23.101616 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f94737b-80ce-4e92-85df-18fa2b1cbe8e" containerName="console" Sep 29 11:03:23 crc kubenswrapper[4752]: I0929 11:03:23.101633 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f94737b-80ce-4e92-85df-18fa2b1cbe8e" containerName="console" Sep 29 11:03:23 crc kubenswrapper[4752]: E0929 11:03:23.101644 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e25c334c-c7a9-46d2-b94b-9956db4aac80" containerName="mariadb-database-create" Sep 29 11:03:23 crc kubenswrapper[4752]: I0929 11:03:23.101650 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e25c334c-c7a9-46d2-b94b-9956db4aac80" containerName="mariadb-database-create" Sep 29 11:03:23 crc kubenswrapper[4752]: I0929 11:03:23.101984 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f94737b-80ce-4e92-85df-18fa2b1cbe8e" containerName="console" Sep 29 11:03:23 crc kubenswrapper[4752]: I0929 11:03:23.102000 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="e25c334c-c7a9-46d2-b94b-9956db4aac80" containerName="mariadb-database-create" Sep 29 11:03:23 crc kubenswrapper[4752]: I0929 11:03:23.102698 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-238e-account-create-vpltk" Sep 29 11:03:23 crc kubenswrapper[4752]: I0929 11:03:23.105534 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-db-secret" Sep 29 11:03:23 crc kubenswrapper[4752]: I0929 11:03:23.117532 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-238e-account-create-vpltk"] Sep 29 11:03:23 crc kubenswrapper[4752]: I0929 11:03:23.197679 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5jmx\" (UniqueName: \"kubernetes.io/projected/9a39b4ca-d533-4620-8e4e-827f7e2dc8de-kube-api-access-r5jmx\") pod \"keystone-238e-account-create-vpltk\" (UID: \"9a39b4ca-d533-4620-8e4e-827f7e2dc8de\") " pod="watcher-kuttl-default/keystone-238e-account-create-vpltk" Sep 29 11:03:23 crc kubenswrapper[4752]: I0929 11:03:23.298973 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5jmx\" (UniqueName: \"kubernetes.io/projected/9a39b4ca-d533-4620-8e4e-827f7e2dc8de-kube-api-access-r5jmx\") pod \"keystone-238e-account-create-vpltk\" (UID: \"9a39b4ca-d533-4620-8e4e-827f7e2dc8de\") " pod="watcher-kuttl-default/keystone-238e-account-create-vpltk" Sep 29 11:03:23 crc kubenswrapper[4752]: I0929 11:03:23.317874 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5jmx\" (UniqueName: \"kubernetes.io/projected/9a39b4ca-d533-4620-8e4e-827f7e2dc8de-kube-api-access-r5jmx\") pod \"keystone-238e-account-create-vpltk\" (UID: \"9a39b4ca-d533-4620-8e4e-827f7e2dc8de\") " pod="watcher-kuttl-default/keystone-238e-account-create-vpltk" Sep 29 11:03:23 crc kubenswrapper[4752]: I0929 11:03:23.360950 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" 
event={"ID":"0b2b1ef4-961c-4803-856d-4d6deb42cc10","Type":"ContainerStarted","Data":"f7f7741d8be15f064168a86a059030861ef15c36fed3997872025aa6fe65beef"} Sep 29 11:03:23 crc kubenswrapper[4752]: I0929 11:03:23.362631 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:03:23 crc kubenswrapper[4752]: I0929 11:03:23.376297 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"23451e36-d03b-4039-ba05-d20e013b089b","Type":"ContainerStarted","Data":"b914dc019238dfc872773cf7d1483e0f62c371ec0f8bcf45ba18a509032aff7f"} Sep 29 11:03:23 crc kubenswrapper[4752]: I0929 11:03:23.376695 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:03:23 crc kubenswrapper[4752]: I0929 11:03:23.399410 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/rabbitmq-server-0" podStartSLOduration=34.810068157 podStartE2EDuration="53.399381791s" podCreationTimestamp="2025-09-29 11:02:30 +0000 UTC" firstStartedPulling="2025-09-29 11:02:32.101941227 +0000 UTC m=+1092.891082894" lastFinishedPulling="2025-09-29 11:02:50.691254851 +0000 UTC m=+1111.480396528" observedRunningTime="2025-09-29 11:03:23.385352941 +0000 UTC m=+1144.174494618" watchObservedRunningTime="2025-09-29 11:03:23.399381791 +0000 UTC m=+1144.188523458" Sep 29 11:03:23 crc kubenswrapper[4752]: I0929 11:03:23.418787 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" podStartSLOduration=35.277151086 podStartE2EDuration="53.418772399s" podCreationTimestamp="2025-09-29 11:02:30 +0000 UTC" firstStartedPulling="2025-09-29 11:02:32.570772801 +0000 UTC m=+1093.359914468" lastFinishedPulling="2025-09-29 11:02:50.712394114 +0000 UTC m=+1111.501535781" observedRunningTime="2025-09-29 
11:03:23.416398217 +0000 UTC m=+1144.205539884" watchObservedRunningTime="2025-09-29 11:03:23.418772399 +0000 UTC m=+1144.207914066" Sep 29 11:03:23 crc kubenswrapper[4752]: I0929 11:03:23.422170 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-238e-account-create-vpltk" Sep 29 11:03:23 crc kubenswrapper[4752]: I0929 11:03:23.835466 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-238e-account-create-vpltk"] Sep 29 11:03:23 crc kubenswrapper[4752]: I0929 11:03:23.932952 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Sep 29 11:03:24 crc kubenswrapper[4752]: I0929 11:03:24.040757 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f94737b-80ce-4e92-85df-18fa2b1cbe8e" path="/var/lib/kubelet/pods/4f94737b-80ce-4e92-85df-18fa2b1cbe8e/volumes" Sep 29 11:03:24 crc kubenswrapper[4752]: I0929 11:03:24.384596 4752 generic.go:334] "Generic (PLEG): container finished" podID="9a39b4ca-d533-4620-8e4e-827f7e2dc8de" containerID="53d9cefff49de70973024a39cc6726dd9a2b5fd75f345c1ae68c6f8604cbae54" exitCode=0 Sep 29 11:03:24 crc kubenswrapper[4752]: I0929 11:03:24.384663 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-238e-account-create-vpltk" event={"ID":"9a39b4ca-d533-4620-8e4e-827f7e2dc8de","Type":"ContainerDied","Data":"53d9cefff49de70973024a39cc6726dd9a2b5fd75f345c1ae68c6f8604cbae54"} Sep 29 11:03:24 crc kubenswrapper[4752]: I0929 11:03:24.384742 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-238e-account-create-vpltk" event={"ID":"9a39b4ca-d533-4620-8e4e-827f7e2dc8de","Type":"ContainerStarted","Data":"edbc200ada33683dbb38c21be0b564aa6aa6e2db9ad2a94504e6a6feb6cab808"} Sep 29 11:03:24 crc kubenswrapper[4752]: I0929 11:03:24.384791 4752 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="watcher-kuttl-default/prometheus-metric-storage-0" podUID="4c27e500-7063-40cc-9a6d-4e8fa0df4a98" containerName="prometheus" containerID="cri-o://f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95" gracePeriod=600 Sep 29 11:03:24 crc kubenswrapper[4752]: I0929 11:03:24.384885 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/prometheus-metric-storage-0" podUID="4c27e500-7063-40cc-9a6d-4e8fa0df4a98" containerName="thanos-sidecar" containerID="cri-o://5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605" gracePeriod=600 Sep 29 11:03:24 crc kubenswrapper[4752]: I0929 11:03:24.384921 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/prometheus-metric-storage-0" podUID="4c27e500-7063-40cc-9a6d-4e8fa0df4a98" containerName="config-reloader" containerID="cri-o://e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647" gracePeriod=600 Sep 29 11:03:24 crc kubenswrapper[4752]: I0929 11:03:24.920623 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.025992 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\") pod \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.026063 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-web-config\") pod \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.026097 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-config\") pod \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.026178 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-config-out\") pod \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.026211 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-tls-assets\") pod \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.026239 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-l86vs\" (UniqueName: \"kubernetes.io/projected/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-kube-api-access-l86vs\") pod \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.026267 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-prometheus-metric-storage-rulefiles-0\") pod \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.026314 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-thanos-prometheus-http-client-file\") pod \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\" (UID: \"4c27e500-7063-40cc-9a6d-4e8fa0df4a98\") " Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.027382 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "4c27e500-7063-40cc-9a6d-4e8fa0df4a98" (UID: "4c27e500-7063-40cc-9a6d-4e8fa0df4a98"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.031748 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-config-out" (OuterVolumeSpecName: "config-out") pod "4c27e500-7063-40cc-9a6d-4e8fa0df4a98" (UID: "4c27e500-7063-40cc-9a6d-4e8fa0df4a98"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.031847 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-config" (OuterVolumeSpecName: "config") pod "4c27e500-7063-40cc-9a6d-4e8fa0df4a98" (UID: "4c27e500-7063-40cc-9a6d-4e8fa0df4a98"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.031965 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-kube-api-access-l86vs" (OuterVolumeSpecName: "kube-api-access-l86vs") pod "4c27e500-7063-40cc-9a6d-4e8fa0df4a98" (UID: "4c27e500-7063-40cc-9a6d-4e8fa0df4a98"). InnerVolumeSpecName "kube-api-access-l86vs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.032320 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "4c27e500-7063-40cc-9a6d-4e8fa0df4a98" (UID: "4c27e500-7063-40cc-9a6d-4e8fa0df4a98"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.033041 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "4c27e500-7063-40cc-9a6d-4e8fa0df4a98" (UID: "4c27e500-7063-40cc-9a6d-4e8fa0df4a98"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.043188 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65a84658-aa50-4259-8efb-e2e46a6339b3" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "4c27e500-7063-40cc-9a6d-4e8fa0df4a98" (UID: "4c27e500-7063-40cc-9a6d-4e8fa0df4a98"). InnerVolumeSpecName "pvc-65a84658-aa50-4259-8efb-e2e46a6339b3". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.051944 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-web-config" (OuterVolumeSpecName: "web-config") pod "4c27e500-7063-40cc-9a6d-4e8fa0df4a98" (UID: "4c27e500-7063-40cc-9a6d-4e8fa0df4a98"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.128430 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l86vs\" (UniqueName: \"kubernetes.io/projected/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-kube-api-access-l86vs\") on node \"crc\" DevicePath \"\"" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.128461 4752 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.128540 4752 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.128566 4752 reconciler_common.go:286] "operationExecutor.UnmountDevice 
started for volume \"pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\") on node \"crc\" " Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.128718 4752 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-web-config\") on node \"crc\" DevicePath \"\"" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.128762 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-config\") on node \"crc\" DevicePath \"\"" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.128774 4752 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-config-out\") on node \"crc\" DevicePath \"\"" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.128783 4752 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4c27e500-7063-40cc-9a6d-4e8fa0df4a98-tls-assets\") on node \"crc\" DevicePath \"\"" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.144502 4752 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.145069 4752 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-65a84658-aa50-4259-8efb-e2e46a6339b3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65a84658-aa50-4259-8efb-e2e46a6339b3") on node "crc" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.230225 4752 reconciler_common.go:293] "Volume detached for volume \"pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\") on node \"crc\" DevicePath \"\"" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.395012 4752 generic.go:334] "Generic (PLEG): container finished" podID="4c27e500-7063-40cc-9a6d-4e8fa0df4a98" containerID="5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605" exitCode=0 Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.395052 4752 generic.go:334] "Generic (PLEG): container finished" podID="4c27e500-7063-40cc-9a6d-4e8fa0df4a98" containerID="e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647" exitCode=0 Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.395063 4752 generic.go:334] "Generic (PLEG): container finished" podID="4c27e500-7063-40cc-9a6d-4e8fa0df4a98" containerID="f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95" exitCode=0 Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.395083 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.395098 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"4c27e500-7063-40cc-9a6d-4e8fa0df4a98","Type":"ContainerDied","Data":"5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605"} Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.395150 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"4c27e500-7063-40cc-9a6d-4e8fa0df4a98","Type":"ContainerDied","Data":"e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647"} Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.395163 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"4c27e500-7063-40cc-9a6d-4e8fa0df4a98","Type":"ContainerDied","Data":"f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95"} Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.395172 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"4c27e500-7063-40cc-9a6d-4e8fa0df4a98","Type":"ContainerDied","Data":"5957773d2291ea0407409a09509a21976b2c685a8186970434ea1744e4aaddc6"} Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.395194 4752 scope.go:117] "RemoveContainer" containerID="5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.438120 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.445260 4752 scope.go:117] "RemoveContainer" containerID="e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.447821 4752 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.470381 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.470436 4752 scope.go:117] "RemoveContainer" containerID="f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95" Sep 29 11:03:25 crc kubenswrapper[4752]: E0929 11:03:25.470791 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c27e500-7063-40cc-9a6d-4e8fa0df4a98" containerName="config-reloader" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.470829 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c27e500-7063-40cc-9a6d-4e8fa0df4a98" containerName="config-reloader" Sep 29 11:03:25 crc kubenswrapper[4752]: E0929 11:03:25.470868 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c27e500-7063-40cc-9a6d-4e8fa0df4a98" containerName="prometheus" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.470876 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c27e500-7063-40cc-9a6d-4e8fa0df4a98" containerName="prometheus" Sep 29 11:03:25 crc kubenswrapper[4752]: E0929 11:03:25.470889 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c27e500-7063-40cc-9a6d-4e8fa0df4a98" containerName="init-config-reloader" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.470898 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c27e500-7063-40cc-9a6d-4e8fa0df4a98" containerName="init-config-reloader" Sep 29 11:03:25 crc kubenswrapper[4752]: E0929 11:03:25.470919 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c27e500-7063-40cc-9a6d-4e8fa0df4a98" containerName="thanos-sidecar" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.470945 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c27e500-7063-40cc-9a6d-4e8fa0df4a98" 
containerName="thanos-sidecar" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.471149 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c27e500-7063-40cc-9a6d-4e8fa0df4a98" containerName="prometheus" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.471174 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c27e500-7063-40cc-9a6d-4e8fa0df4a98" containerName="config-reloader" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.471188 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c27e500-7063-40cc-9a6d-4e8fa0df4a98" containerName="thanos-sidecar" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.478251 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.480856 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.482087 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.482182 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"prometheus-metric-storage-rulefiles-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.482232 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-web-config" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.492784 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-metric-storage-prometheus-svc" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.492873 4752 reflector.go:368] Caches populated for *v1.Secret from 
object-"watcher-kuttl-default"/"metric-storage-prometheus-dockercfg-bdf9r" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.494415 4752 scope.go:117] "RemoveContainer" containerID="3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.496160 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-tls-assets-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.499527 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.577932 4752 scope.go:117] "RemoveContainer" containerID="5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605" Sep 29 11:03:25 crc kubenswrapper[4752]: E0929 11:03:25.578356 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605\": container with ID starting with 5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605 not found: ID does not exist" containerID="5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.578392 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605"} err="failed to get container status \"5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605\": rpc error: code = NotFound desc = could not find container \"5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605\": container with ID starting with 5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605 not found: ID does not exist" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.578429 4752 scope.go:117] "RemoveContainer" 
containerID="e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647" Sep 29 11:03:25 crc kubenswrapper[4752]: E0929 11:03:25.578704 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647\": container with ID starting with e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647 not found: ID does not exist" containerID="e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.578748 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647"} err="failed to get container status \"e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647\": rpc error: code = NotFound desc = could not find container \"e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647\": container with ID starting with e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647 not found: ID does not exist" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.578769 4752 scope.go:117] "RemoveContainer" containerID="f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95" Sep 29 11:03:25 crc kubenswrapper[4752]: E0929 11:03:25.579003 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95\": container with ID starting with f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95 not found: ID does not exist" containerID="f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.579027 4752 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95"} err="failed to get container status \"f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95\": rpc error: code = NotFound desc = could not find container \"f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95\": container with ID starting with f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95 not found: ID does not exist" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.579055 4752 scope.go:117] "RemoveContainer" containerID="3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f" Sep 29 11:03:25 crc kubenswrapper[4752]: E0929 11:03:25.579249 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f\": container with ID starting with 3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f not found: ID does not exist" containerID="3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.579267 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f"} err="failed to get container status \"3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f\": rpc error: code = NotFound desc = could not find container \"3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f\": container with ID starting with 3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f not found: ID does not exist" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.579302 4752 scope.go:117] "RemoveContainer" containerID="5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.579500 4752 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605"} err="failed to get container status \"5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605\": rpc error: code = NotFound desc = could not find container \"5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605\": container with ID starting with 5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605 not found: ID does not exist" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.579517 4752 scope.go:117] "RemoveContainer" containerID="e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.579688 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647"} err="failed to get container status \"e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647\": rpc error: code = NotFound desc = could not find container \"e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647\": container with ID starting with e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647 not found: ID does not exist" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.579707 4752 scope.go:117] "RemoveContainer" containerID="f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.579961 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95"} err="failed to get container status \"f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95\": rpc error: code = NotFound desc = could not find container \"f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95\": container with ID starting with f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95 not 
found: ID does not exist" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.579974 4752 scope.go:117] "RemoveContainer" containerID="3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.580191 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f"} err="failed to get container status \"3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f\": rpc error: code = NotFound desc = could not find container \"3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f\": container with ID starting with 3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f not found: ID does not exist" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.580204 4752 scope.go:117] "RemoveContainer" containerID="5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.580380 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605"} err="failed to get container status \"5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605\": rpc error: code = NotFound desc = could not find container \"5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605\": container with ID starting with 5b05930ac951f4413dedf617531740751d1b12d313edeb1e3cf6c88707192605 not found: ID does not exist" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.580394 4752 scope.go:117] "RemoveContainer" containerID="e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.580961 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647"} err="failed to get 
container status \"e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647\": rpc error: code = NotFound desc = could not find container \"e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647\": container with ID starting with e58c04a47f7d10a2b647dd1ae5468ef4d97d5f0ddcd53122a0317d502bace647 not found: ID does not exist" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.580981 4752 scope.go:117] "RemoveContainer" containerID="f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.581186 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95"} err="failed to get container status \"f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95\": rpc error: code = NotFound desc = could not find container \"f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95\": container with ID starting with f1571559973f20c9d9a814286b4195ca2ddf0bdf3811928039f17e0a52032c95 not found: ID does not exist" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.581203 4752 scope.go:117] "RemoveContainer" containerID="3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.581489 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f"} err="failed to get container status \"3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f\": rpc error: code = NotFound desc = could not find container \"3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f\": container with ID starting with 3fbc9051d817cdbe2d0a4d9b32e0712511bb3df0f5e6f1091c65fbd66abc443f not found: ID does not exist" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.637163 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-config\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.637219 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.637242 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.637265 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.637284 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: 
\"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.637303 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwzcq\" (UniqueName: \"kubernetes.io/projected/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-kube-api-access-bwzcq\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.637341 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.637362 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.637380 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 
11:03:25.637395 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.637432 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.739318 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-config\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.739368 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.739391 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " 
pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.739413 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.739429 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.739450 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwzcq\" (UniqueName: \"kubernetes.io/projected/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-kube-api-access-bwzcq\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.739491 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.739516 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.739537 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.739554 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.739594 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.741835 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 
11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.746204 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.746354 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.746999 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-config\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.747706 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.747920 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc 
kubenswrapper[4752]: I0929 11:03:25.753287 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.755082 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.763203 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwzcq\" (UniqueName: \"kubernetes.io/projected/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-kube-api-access-bwzcq\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.764242 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.768368 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.768403 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9b2293af6eff455224fb1fa348a6b5097aab5a7c0afb95e3cd736ba39d87a712/globalmount\"" pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.797437 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65a84658-aa50-4259-8efb-e2e46a6339b3\") pod \"prometheus-metric-storage-0\" (UID: \"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.837757 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-238e-account-create-vpltk" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.859921 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.941671 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5jmx\" (UniqueName: \"kubernetes.io/projected/9a39b4ca-d533-4620-8e4e-827f7e2dc8de-kube-api-access-r5jmx\") pod \"9a39b4ca-d533-4620-8e4e-827f7e2dc8de\" (UID: \"9a39b4ca-d533-4620-8e4e-827f7e2dc8de\") " Sep 29 11:03:25 crc kubenswrapper[4752]: I0929 11:03:25.946061 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a39b4ca-d533-4620-8e4e-827f7e2dc8de-kube-api-access-r5jmx" (OuterVolumeSpecName: "kube-api-access-r5jmx") pod "9a39b4ca-d533-4620-8e4e-827f7e2dc8de" (UID: "9a39b4ca-d533-4620-8e4e-827f7e2dc8de"). InnerVolumeSpecName "kube-api-access-r5jmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:03:26 crc kubenswrapper[4752]: I0929 11:03:26.044825 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5jmx\" (UniqueName: \"kubernetes.io/projected/9a39b4ca-d533-4620-8e4e-827f7e2dc8de-kube-api-access-r5jmx\") on node \"crc\" DevicePath \"\"" Sep 29 11:03:26 crc kubenswrapper[4752]: I0929 11:03:26.049714 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c27e500-7063-40cc-9a6d-4e8fa0df4a98" path="/var/lib/kubelet/pods/4c27e500-7063-40cc-9a6d-4e8fa0df4a98/volumes" Sep 29 11:03:26 crc kubenswrapper[4752]: I0929 11:03:26.327662 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Sep 29 11:03:26 crc kubenswrapper[4752]: I0929 11:03:26.403882 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2","Type":"ContainerStarted","Data":"c289d769db168c5d19d4f9362f1bcc7d30c78841d5bd954174faf2b6764204ff"} Sep 29 11:03:26 crc kubenswrapper[4752]: I0929 
11:03:26.405594 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-238e-account-create-vpltk" event={"ID":"9a39b4ca-d533-4620-8e4e-827f7e2dc8de","Type":"ContainerDied","Data":"edbc200ada33683dbb38c21be0b564aa6aa6e2db9ad2a94504e6a6feb6cab808"} Sep 29 11:03:26 crc kubenswrapper[4752]: I0929 11:03:26.405632 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edbc200ada33683dbb38c21be0b564aa6aa6e2db9ad2a94504e6a6feb6cab808" Sep 29 11:03:26 crc kubenswrapper[4752]: I0929 11:03:26.405878 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-238e-account-create-vpltk" Sep 29 11:03:29 crc kubenswrapper[4752]: I0929 11:03:29.432708 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2","Type":"ContainerStarted","Data":"aad5605068150f9631d30e20237e5b5490ffeafc1b4f46b89c8c5a624e2a0eed"} Sep 29 11:03:32 crc kubenswrapper[4752]: I0929 11:03:32.133965 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Sep 29 11:03:36 crc kubenswrapper[4752]: I0929 11:03:36.500549 4752 generic.go:334] "Generic (PLEG): container finished" podID="e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2" containerID="aad5605068150f9631d30e20237e5b5490ffeafc1b4f46b89c8c5a624e2a0eed" exitCode=0 Sep 29 11:03:36 crc kubenswrapper[4752]: I0929 11:03:36.500590 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2","Type":"ContainerDied","Data":"aad5605068150f9631d30e20237e5b5490ffeafc1b4f46b89c8c5a624e2a0eed"} Sep 29 11:03:37 crc kubenswrapper[4752]: I0929 11:03:37.514202 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" 
event={"ID":"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2","Type":"ContainerStarted","Data":"329db9e81e2ff7daee9bfd4a05ca93350fe4ce73df9366925e969e72bdc24ad8"} Sep 29 11:03:40 crc kubenswrapper[4752]: I0929 11:03:40.544325 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2","Type":"ContainerStarted","Data":"e627028f5ea0da5fab9fb40a65c2dbc11a6cc12d5cca80cca18f9be12ec2bee8"} Sep 29 11:03:40 crc kubenswrapper[4752]: I0929 11:03:40.544854 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2","Type":"ContainerStarted","Data":"f94ebd15a8b7408fdbc4c544d4ca1d833f2740bb32bb00ed17c3b5b5005baab4"} Sep 29 11:03:40 crc kubenswrapper[4752]: I0929 11:03:40.615448 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/prometheus-metric-storage-0" podStartSLOduration=15.615425803 podStartE2EDuration="15.615425803s" podCreationTimestamp="2025-09-29 11:03:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:03:40.60907497 +0000 UTC m=+1161.398216637" watchObservedRunningTime="2025-09-29 11:03:40.615425803 +0000 UTC m=+1161.404567470" Sep 29 11:03:40 crc kubenswrapper[4752]: I0929 11:03:40.861427 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:40 crc kubenswrapper[4752]: I0929 11:03:40.861484 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:40 crc kubenswrapper[4752]: I0929 11:03:40.869311 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:41 crc 
kubenswrapper[4752]: I0929 11:03:41.563353 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/prometheus-metric-storage-0" Sep 29 11:03:41 crc kubenswrapper[4752]: I0929 11:03:41.570361 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/rabbitmq-server-0" Sep 29 11:03:42 crc kubenswrapper[4752]: I0929 11:03:42.161469 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-db-sync-6p29p"] Sep 29 11:03:42 crc kubenswrapper[4752]: E0929 11:03:42.163168 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a39b4ca-d533-4620-8e4e-827f7e2dc8de" containerName="mariadb-account-create" Sep 29 11:03:42 crc kubenswrapper[4752]: I0929 11:03:42.163197 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a39b4ca-d533-4620-8e4e-827f7e2dc8de" containerName="mariadb-account-create" Sep 29 11:03:42 crc kubenswrapper[4752]: I0929 11:03:42.164040 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a39b4ca-d533-4620-8e4e-827f7e2dc8de" containerName="mariadb-account-create" Sep 29 11:03:42 crc kubenswrapper[4752]: I0929 11:03:42.164985 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-6p29p" Sep 29 11:03:42 crc kubenswrapper[4752]: I0929 11:03:42.176406 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Sep 29 11:03:42 crc kubenswrapper[4752]: I0929 11:03:42.176538 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Sep 29 11:03:42 crc kubenswrapper[4752]: I0929 11:03:42.176657 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-xxmcc" Sep 29 11:03:42 crc kubenswrapper[4752]: I0929 11:03:42.176934 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Sep 29 11:03:42 crc kubenswrapper[4752]: I0929 11:03:42.192921 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-6p29p"] Sep 29 11:03:42 crc kubenswrapper[4752]: I0929 11:03:42.231627 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cce556e-44f8-46c9-ba1d-b609c7e6468d-config-data\") pod \"keystone-db-sync-6p29p\" (UID: \"1cce556e-44f8-46c9-ba1d-b609c7e6468d\") " pod="watcher-kuttl-default/keystone-db-sync-6p29p" Sep 29 11:03:42 crc kubenswrapper[4752]: I0929 11:03:42.231680 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bdcg\" (UniqueName: \"kubernetes.io/projected/1cce556e-44f8-46c9-ba1d-b609c7e6468d-kube-api-access-9bdcg\") pod \"keystone-db-sync-6p29p\" (UID: \"1cce556e-44f8-46c9-ba1d-b609c7e6468d\") " pod="watcher-kuttl-default/keystone-db-sync-6p29p" Sep 29 11:03:42 crc kubenswrapper[4752]: I0929 11:03:42.232068 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1cce556e-44f8-46c9-ba1d-b609c7e6468d-combined-ca-bundle\") pod \"keystone-db-sync-6p29p\" (UID: \"1cce556e-44f8-46c9-ba1d-b609c7e6468d\") " pod="watcher-kuttl-default/keystone-db-sync-6p29p" Sep 29 11:03:42 crc kubenswrapper[4752]: I0929 11:03:42.333743 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cce556e-44f8-46c9-ba1d-b609c7e6468d-combined-ca-bundle\") pod \"keystone-db-sync-6p29p\" (UID: \"1cce556e-44f8-46c9-ba1d-b609c7e6468d\") " pod="watcher-kuttl-default/keystone-db-sync-6p29p" Sep 29 11:03:42 crc kubenswrapper[4752]: I0929 11:03:42.333900 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cce556e-44f8-46c9-ba1d-b609c7e6468d-config-data\") pod \"keystone-db-sync-6p29p\" (UID: \"1cce556e-44f8-46c9-ba1d-b609c7e6468d\") " pod="watcher-kuttl-default/keystone-db-sync-6p29p" Sep 29 11:03:42 crc kubenswrapper[4752]: I0929 11:03:42.333925 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bdcg\" (UniqueName: \"kubernetes.io/projected/1cce556e-44f8-46c9-ba1d-b609c7e6468d-kube-api-access-9bdcg\") pod \"keystone-db-sync-6p29p\" (UID: \"1cce556e-44f8-46c9-ba1d-b609c7e6468d\") " pod="watcher-kuttl-default/keystone-db-sync-6p29p" Sep 29 11:03:42 crc kubenswrapper[4752]: I0929 11:03:42.343737 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cce556e-44f8-46c9-ba1d-b609c7e6468d-config-data\") pod \"keystone-db-sync-6p29p\" (UID: \"1cce556e-44f8-46c9-ba1d-b609c7e6468d\") " pod="watcher-kuttl-default/keystone-db-sync-6p29p" Sep 29 11:03:42 crc kubenswrapper[4752]: I0929 11:03:42.345692 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1cce556e-44f8-46c9-ba1d-b609c7e6468d-combined-ca-bundle\") pod \"keystone-db-sync-6p29p\" (UID: \"1cce556e-44f8-46c9-ba1d-b609c7e6468d\") " pod="watcher-kuttl-default/keystone-db-sync-6p29p" Sep 29 11:03:42 crc kubenswrapper[4752]: I0929 11:03:42.361126 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bdcg\" (UniqueName: \"kubernetes.io/projected/1cce556e-44f8-46c9-ba1d-b609c7e6468d-kube-api-access-9bdcg\") pod \"keystone-db-sync-6p29p\" (UID: \"1cce556e-44f8-46c9-ba1d-b609c7e6468d\") " pod="watcher-kuttl-default/keystone-db-sync-6p29p" Sep 29 11:03:42 crc kubenswrapper[4752]: I0929 11:03:42.500987 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-6p29p" Sep 29 11:03:42 crc kubenswrapper[4752]: I0929 11:03:42.939896 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-6p29p"] Sep 29 11:03:43 crc kubenswrapper[4752]: I0929 11:03:43.581541 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-6p29p" event={"ID":"1cce556e-44f8-46c9-ba1d-b609c7e6468d","Type":"ContainerStarted","Data":"ea964b9dfd3428fa895c0d3ec12b492d7912aa39132e4bfc033d5a2c476471f3"} Sep 29 11:03:51 crc kubenswrapper[4752]: I0929 11:03:51.658405 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-6p29p" event={"ID":"1cce556e-44f8-46c9-ba1d-b609c7e6468d","Type":"ContainerStarted","Data":"4d3149214022ac33d5cd5dcaa9685e20715f7a26cc5a0de14b7149211b77adde"} Sep 29 11:03:51 crc kubenswrapper[4752]: I0929 11:03:51.709781 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-db-sync-6p29p" podStartSLOduration=2.152054946 podStartE2EDuration="9.70975629s" podCreationTimestamp="2025-09-29 11:03:42 +0000 UTC" firstStartedPulling="2025-09-29 11:03:42.945779038 +0000 UTC m=+1163.734920705" 
lastFinishedPulling="2025-09-29 11:03:50.503480382 +0000 UTC m=+1171.292622049" observedRunningTime="2025-09-29 11:03:51.699354463 +0000 UTC m=+1172.488496200" watchObservedRunningTime="2025-09-29 11:03:51.70975629 +0000 UTC m=+1172.498897977" Sep 29 11:03:54 crc kubenswrapper[4752]: I0929 11:03:54.689115 4752 generic.go:334] "Generic (PLEG): container finished" podID="1cce556e-44f8-46c9-ba1d-b609c7e6468d" containerID="4d3149214022ac33d5cd5dcaa9685e20715f7a26cc5a0de14b7149211b77adde" exitCode=0 Sep 29 11:03:54 crc kubenswrapper[4752]: I0929 11:03:54.689200 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-6p29p" event={"ID":"1cce556e-44f8-46c9-ba1d-b609c7e6468d","Type":"ContainerDied","Data":"4d3149214022ac33d5cd5dcaa9685e20715f7a26cc5a0de14b7149211b77adde"} Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.102845 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-6p29p" Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.189972 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bdcg\" (UniqueName: \"kubernetes.io/projected/1cce556e-44f8-46c9-ba1d-b609c7e6468d-kube-api-access-9bdcg\") pod \"1cce556e-44f8-46c9-ba1d-b609c7e6468d\" (UID: \"1cce556e-44f8-46c9-ba1d-b609c7e6468d\") " Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.190039 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cce556e-44f8-46c9-ba1d-b609c7e6468d-config-data\") pod \"1cce556e-44f8-46c9-ba1d-b609c7e6468d\" (UID: \"1cce556e-44f8-46c9-ba1d-b609c7e6468d\") " Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.190177 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cce556e-44f8-46c9-ba1d-b609c7e6468d-combined-ca-bundle\") pod 
\"1cce556e-44f8-46c9-ba1d-b609c7e6468d\" (UID: \"1cce556e-44f8-46c9-ba1d-b609c7e6468d\") " Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.232112 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cce556e-44f8-46c9-ba1d-b609c7e6468d-kube-api-access-9bdcg" (OuterVolumeSpecName: "kube-api-access-9bdcg") pod "1cce556e-44f8-46c9-ba1d-b609c7e6468d" (UID: "1cce556e-44f8-46c9-ba1d-b609c7e6468d"). InnerVolumeSpecName "kube-api-access-9bdcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.299005 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bdcg\" (UniqueName: \"kubernetes.io/projected/1cce556e-44f8-46c9-ba1d-b609c7e6468d-kube-api-access-9bdcg\") on node \"crc\" DevicePath \"\"" Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.319975 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cce556e-44f8-46c9-ba1d-b609c7e6468d-config-data" (OuterVolumeSpecName: "config-data") pod "1cce556e-44f8-46c9-ba1d-b609c7e6468d" (UID: "1cce556e-44f8-46c9-ba1d-b609c7e6468d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.339069 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cce556e-44f8-46c9-ba1d-b609c7e6468d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1cce556e-44f8-46c9-ba1d-b609c7e6468d" (UID: "1cce556e-44f8-46c9-ba1d-b609c7e6468d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.400156 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cce556e-44f8-46c9-ba1d-b609c7e6468d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.400188 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cce556e-44f8-46c9-ba1d-b609c7e6468d-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.718900 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-6p29p" event={"ID":"1cce556e-44f8-46c9-ba1d-b609c7e6468d","Type":"ContainerDied","Data":"ea964b9dfd3428fa895c0d3ec12b492d7912aa39132e4bfc033d5a2c476471f3"} Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.719484 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea964b9dfd3428fa895c0d3ec12b492d7912aa39132e4bfc033d5a2c476471f3" Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.718980 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-6p29p" Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.918881 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-9vskz"] Sep 29 11:03:56 crc kubenswrapper[4752]: E0929 11:03:56.919279 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cce556e-44f8-46c9-ba1d-b609c7e6468d" containerName="keystone-db-sync" Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.919297 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cce556e-44f8-46c9-ba1d-b609c7e6468d" containerName="keystone-db-sync" Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.919477 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cce556e-44f8-46c9-ba1d-b609c7e6468d" containerName="keystone-db-sync" Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.920080 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.928894 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.929628 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-xxmcc" Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.929640 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.929786 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Sep 29 11:03:56 crc kubenswrapper[4752]: I0929 11:03:56.944241 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-9vskz"] Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.007578 4752 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-combined-ca-bundle\") pod \"keystone-bootstrap-9vskz\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.007670 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-scripts\") pod \"keystone-bootstrap-9vskz\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.007776 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-fernet-keys\") pod \"keystone-bootstrap-9vskz\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.007835 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-credential-keys\") pod \"keystone-bootstrap-9vskz\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.007905 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-config-data\") pod \"keystone-bootstrap-9vskz\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:03:57 crc 
kubenswrapper[4752]: I0929 11:03:57.007932 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6pz9\" (UniqueName: \"kubernetes.io/projected/c588168d-087f-4ee1-9f3f-dcfaeeed587c-kube-api-access-p6pz9\") pod \"keystone-bootstrap-9vskz\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.109958 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-combined-ca-bundle\") pod \"keystone-bootstrap-9vskz\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.110381 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-scripts\") pod \"keystone-bootstrap-9vskz\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.110434 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-fernet-keys\") pod \"keystone-bootstrap-9vskz\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.110454 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-credential-keys\") pod \"keystone-bootstrap-9vskz\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:03:57 crc kubenswrapper[4752]: 
I0929 11:03:57.110491 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-config-data\") pod \"keystone-bootstrap-9vskz\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.110509 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6pz9\" (UniqueName: \"kubernetes.io/projected/c588168d-087f-4ee1-9f3f-dcfaeeed587c-kube-api-access-p6pz9\") pod \"keystone-bootstrap-9vskz\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.117898 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-combined-ca-bundle\") pod \"keystone-bootstrap-9vskz\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.118760 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-config-data\") pod \"keystone-bootstrap-9vskz\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.120165 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-scripts\") pod \"keystone-bootstrap-9vskz\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.121185 4752 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-credential-keys\") pod \"keystone-bootstrap-9vskz\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.122686 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-fernet-keys\") pod \"keystone-bootstrap-9vskz\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.124649 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.128155 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.131481 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.131915 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.137993 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.138210 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6pz9\" (UniqueName: \"kubernetes.io/projected/c588168d-087f-4ee1-9f3f-dcfaeeed587c-kube-api-access-p6pz9\") pod \"keystone-bootstrap-9vskz\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.212001 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.212047 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-config-data\") pod \"ceilometer-0\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.212124 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c62050de-6e3f-4b09-a7a7-9038430e54ce-run-httpd\") pod \"ceilometer-0\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.212158 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5x54\" (UniqueName: \"kubernetes.io/projected/c62050de-6e3f-4b09-a7a7-9038430e54ce-kube-api-access-n5x54\") pod \"ceilometer-0\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.212173 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-scripts\") pod \"ceilometer-0\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.212193 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c62050de-6e3f-4b09-a7a7-9038430e54ce-log-httpd\") pod \"ceilometer-0\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.212258 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.239693 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.313504 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.313573 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.313602 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-config-data\") pod \"ceilometer-0\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.313665 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c62050de-6e3f-4b09-a7a7-9038430e54ce-run-httpd\") pod \"ceilometer-0\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.313706 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-scripts\") pod \"ceilometer-0\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.313725 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5x54\" (UniqueName: \"kubernetes.io/projected/c62050de-6e3f-4b09-a7a7-9038430e54ce-kube-api-access-n5x54\") pod \"ceilometer-0\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.313761 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c62050de-6e3f-4b09-a7a7-9038430e54ce-log-httpd\") pod \"ceilometer-0\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.314285 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c62050de-6e3f-4b09-a7a7-9038430e54ce-log-httpd\") pod \"ceilometer-0\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.314576 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c62050de-6e3f-4b09-a7a7-9038430e54ce-run-httpd\") pod \"ceilometer-0\" (UID: 
\"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.317460 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.317650 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.317753 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-scripts\") pod \"ceilometer-0\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.318412 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-config-data\") pod \"ceilometer-0\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.333536 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5x54\" (UniqueName: \"kubernetes.io/projected/c62050de-6e3f-4b09-a7a7-9038430e54ce-kube-api-access-n5x54\") pod \"ceilometer-0\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.503971 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.683697 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-9vskz"] Sep 29 11:03:57 crc kubenswrapper[4752]: W0929 11:03:57.690143 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc588168d_087f_4ee1_9f3f_dcfaeeed587c.slice/crio-d49b8db2fbaa835bc7708110066f160d3c08ea7d43eac7c19ebb37c22fed8dd5 WatchSource:0}: Error finding container d49b8db2fbaa835bc7708110066f160d3c08ea7d43eac7c19ebb37c22fed8dd5: Status 404 returned error can't find the container with id d49b8db2fbaa835bc7708110066f160d3c08ea7d43eac7c19ebb37c22fed8dd5 Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.732777 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-9vskz" event={"ID":"c588168d-087f-4ee1-9f3f-dcfaeeed587c","Type":"ContainerStarted","Data":"d49b8db2fbaa835bc7708110066f160d3c08ea7d43eac7c19ebb37c22fed8dd5"} Sep 29 11:03:57 crc kubenswrapper[4752]: I0929 11:03:57.941657 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:03:57 crc kubenswrapper[4752]: W0929 11:03:57.943720 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc62050de_6e3f_4b09_a7a7_9038430e54ce.slice/crio-9b9bb5c68a181084b810560270743bb281b06b0e2fda4cd823f2c787f0bce922 WatchSource:0}: Error finding container 9b9bb5c68a181084b810560270743bb281b06b0e2fda4cd823f2c787f0bce922: Status 404 returned error can't find the container with id 9b9bb5c68a181084b810560270743bb281b06b0e2fda4cd823f2c787f0bce922 Sep 29 11:03:58 crc kubenswrapper[4752]: I0929 11:03:58.619183 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:03:58 crc 
kubenswrapper[4752]: I0929 11:03:58.757275 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-9vskz" event={"ID":"c588168d-087f-4ee1-9f3f-dcfaeeed587c","Type":"ContainerStarted","Data":"e01016c76542e51402483036d4b612e7c0e136acbb18a2b4d89914506235824b"} Sep 29 11:03:58 crc kubenswrapper[4752]: I0929 11:03:58.758516 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c62050de-6e3f-4b09-a7a7-9038430e54ce","Type":"ContainerStarted","Data":"9b9bb5c68a181084b810560270743bb281b06b0e2fda4cd823f2c787f0bce922"} Sep 29 11:03:58 crc kubenswrapper[4752]: I0929 11:03:58.789253 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-bootstrap-9vskz" podStartSLOduration=2.7892319260000003 podStartE2EDuration="2.789231926s" podCreationTimestamp="2025-09-29 11:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:03:58.779164668 +0000 UTC m=+1179.568306345" watchObservedRunningTime="2025-09-29 11:03:58.789231926 +0000 UTC m=+1179.578373593" Sep 29 11:04:01 crc kubenswrapper[4752]: I0929 11:04:01.793712 4752 generic.go:334] "Generic (PLEG): container finished" podID="c588168d-087f-4ee1-9f3f-dcfaeeed587c" containerID="e01016c76542e51402483036d4b612e7c0e136acbb18a2b4d89914506235824b" exitCode=0 Sep 29 11:04:01 crc kubenswrapper[4752]: I0929 11:04:01.793825 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-9vskz" event={"ID":"c588168d-087f-4ee1-9f3f-dcfaeeed587c","Type":"ContainerDied","Data":"e01016c76542e51402483036d4b612e7c0e136acbb18a2b4d89914506235824b"} Sep 29 11:04:02 crc kubenswrapper[4752]: I0929 11:04:02.804741 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"c62050de-6e3f-4b09-a7a7-9038430e54ce","Type":"ContainerStarted","Data":"4c69423b5cd9bdf2002e45c57217a45936b03429dd6d0e7e112a95c1d82a1794"} Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.210612 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.320354 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-credential-keys\") pod \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.320447 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-combined-ca-bundle\") pod \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.320524 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-config-data\") pod \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.320551 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-scripts\") pod \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.320608 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6pz9\" (UniqueName: 
\"kubernetes.io/projected/c588168d-087f-4ee1-9f3f-dcfaeeed587c-kube-api-access-p6pz9\") pod \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.320641 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-fernet-keys\") pod \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\" (UID: \"c588168d-087f-4ee1-9f3f-dcfaeeed587c\") " Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.326279 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c588168d-087f-4ee1-9f3f-dcfaeeed587c-kube-api-access-p6pz9" (OuterVolumeSpecName: "kube-api-access-p6pz9") pod "c588168d-087f-4ee1-9f3f-dcfaeeed587c" (UID: "c588168d-087f-4ee1-9f3f-dcfaeeed587c"). InnerVolumeSpecName "kube-api-access-p6pz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.326676 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c588168d-087f-4ee1-9f3f-dcfaeeed587c" (UID: "c588168d-087f-4ee1-9f3f-dcfaeeed587c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.327407 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-scripts" (OuterVolumeSpecName: "scripts") pod "c588168d-087f-4ee1-9f3f-dcfaeeed587c" (UID: "c588168d-087f-4ee1-9f3f-dcfaeeed587c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.327979 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c588168d-087f-4ee1-9f3f-dcfaeeed587c" (UID: "c588168d-087f-4ee1-9f3f-dcfaeeed587c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.353672 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-config-data" (OuterVolumeSpecName: "config-data") pod "c588168d-087f-4ee1-9f3f-dcfaeeed587c" (UID: "c588168d-087f-4ee1-9f3f-dcfaeeed587c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.356655 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c588168d-087f-4ee1-9f3f-dcfaeeed587c" (UID: "c588168d-087f-4ee1-9f3f-dcfaeeed587c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.422517 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6pz9\" (UniqueName: \"kubernetes.io/projected/c588168d-087f-4ee1-9f3f-dcfaeeed587c-kube-api-access-p6pz9\") on node \"crc\" DevicePath \"\"" Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.422560 4752 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.422575 4752 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.422587 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.422600 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.422611 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c588168d-087f-4ee1-9f3f-dcfaeeed587c-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.815393 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c62050de-6e3f-4b09-a7a7-9038430e54ce","Type":"ContainerStarted","Data":"20eb5c2cb774aa124a0c46c80c779afe1facd1d05932bbb28bee06d4866a9efc"} Sep 29 11:04:03 crc kubenswrapper[4752]: 
I0929 11:04:03.817059 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-9vskz" event={"ID":"c588168d-087f-4ee1-9f3f-dcfaeeed587c","Type":"ContainerDied","Data":"d49b8db2fbaa835bc7708110066f160d3c08ea7d43eac7c19ebb37c22fed8dd5"} Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.817089 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d49b8db2fbaa835bc7708110066f160d3c08ea7d43eac7c19ebb37c22fed8dd5" Sep 29 11:04:03 crc kubenswrapper[4752]: I0929 11:04:03.817152 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-9vskz" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.004498 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-9vskz"] Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.012400 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-9vskz"] Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.042156 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c588168d-087f-4ee1-9f3f-dcfaeeed587c" path="/var/lib/kubelet/pods/c588168d-087f-4ee1-9f3f-dcfaeeed587c/volumes" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.091704 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-fr8mq"] Sep 29 11:04:04 crc kubenswrapper[4752]: E0929 11:04:04.092053 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c588168d-087f-4ee1-9f3f-dcfaeeed587c" containerName="keystone-bootstrap" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.092073 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c588168d-087f-4ee1-9f3f-dcfaeeed587c" containerName="keystone-bootstrap" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.092246 4752 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c588168d-087f-4ee1-9f3f-dcfaeeed587c" containerName="keystone-bootstrap" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.092824 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.094793 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.095214 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-xxmcc" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.096355 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.097552 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.109756 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-fr8mq"] Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.239643 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-combined-ca-bundle\") pod \"keystone-bootstrap-fr8mq\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.239735 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-scripts\") pod \"keystone-bootstrap-fr8mq\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:04 crc 
kubenswrapper[4752]: I0929 11:04:04.239831 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-config-data\") pod \"keystone-bootstrap-fr8mq\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.239890 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-credential-keys\") pod \"keystone-bootstrap-fr8mq\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.239935 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-fernet-keys\") pod \"keystone-bootstrap-fr8mq\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.239986 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gln7b\" (UniqueName: \"kubernetes.io/projected/205ab4e1-725f-4b7e-8c70-b809d2390460-kube-api-access-gln7b\") pod \"keystone-bootstrap-fr8mq\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.342064 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gln7b\" (UniqueName: \"kubernetes.io/projected/205ab4e1-725f-4b7e-8c70-b809d2390460-kube-api-access-gln7b\") pod \"keystone-bootstrap-fr8mq\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " 
pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.342268 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-combined-ca-bundle\") pod \"keystone-bootstrap-fr8mq\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.342335 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-scripts\") pod \"keystone-bootstrap-fr8mq\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.342385 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-config-data\") pod \"keystone-bootstrap-fr8mq\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.342448 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-credential-keys\") pod \"keystone-bootstrap-fr8mq\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.342506 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-fernet-keys\") pod \"keystone-bootstrap-fr8mq\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 
11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.354244 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-config-data\") pod \"keystone-bootstrap-fr8mq\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.355381 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-fernet-keys\") pod \"keystone-bootstrap-fr8mq\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.355920 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-scripts\") pod \"keystone-bootstrap-fr8mq\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.357326 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-credential-keys\") pod \"keystone-bootstrap-fr8mq\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.372266 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gln7b\" (UniqueName: \"kubernetes.io/projected/205ab4e1-725f-4b7e-8c70-b809d2390460-kube-api-access-gln7b\") pod \"keystone-bootstrap-fr8mq\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.373658 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-combined-ca-bundle\") pod \"keystone-bootstrap-fr8mq\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.426945 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:04 crc kubenswrapper[4752]: I0929 11:04:04.898992 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-fr8mq"] Sep 29 11:04:04 crc kubenswrapper[4752]: W0929 11:04:04.904383 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod205ab4e1_725f_4b7e_8c70_b809d2390460.slice/crio-7ceaf970dbd17551952f5fc56a624c1a985d760a014236ec185a043640d89af3 WatchSource:0}: Error finding container 7ceaf970dbd17551952f5fc56a624c1a985d760a014236ec185a043640d89af3: Status 404 returned error can't find the container with id 7ceaf970dbd17551952f5fc56a624c1a985d760a014236ec185a043640d89af3 Sep 29 11:04:05 crc kubenswrapper[4752]: I0929 11:04:05.834664 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" event={"ID":"205ab4e1-725f-4b7e-8c70-b809d2390460","Type":"ContainerStarted","Data":"b6f199eebbc77d83b64cdc8243513a246ec991f0c48284c2c2e95bfaeea3d843"} Sep 29 11:04:05 crc kubenswrapper[4752]: I0929 11:04:05.834984 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" event={"ID":"205ab4e1-725f-4b7e-8c70-b809d2390460","Type":"ContainerStarted","Data":"7ceaf970dbd17551952f5fc56a624c1a985d760a014236ec185a043640d89af3"} Sep 29 11:04:05 crc kubenswrapper[4752]: I0929 11:04:05.852038 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" podStartSLOduration=1.852022273 podStartE2EDuration="1.852022273s" podCreationTimestamp="2025-09-29 11:04:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:04:05.848660007 +0000 UTC m=+1186.637801674" watchObservedRunningTime="2025-09-29 11:04:05.852022273 +0000 UTC m=+1186.641163940" Sep 29 11:04:08 crc kubenswrapper[4752]: I0929 11:04:08.881201 4752 generic.go:334] "Generic (PLEG): container finished" podID="205ab4e1-725f-4b7e-8c70-b809d2390460" containerID="b6f199eebbc77d83b64cdc8243513a246ec991f0c48284c2c2e95bfaeea3d843" exitCode=0 Sep 29 11:04:08 crc kubenswrapper[4752]: I0929 11:04:08.881779 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" event={"ID":"205ab4e1-725f-4b7e-8c70-b809d2390460","Type":"ContainerDied","Data":"b6f199eebbc77d83b64cdc8243513a246ec991f0c48284c2c2e95bfaeea3d843"} Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.159724 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.204020 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-scripts\") pod \"205ab4e1-725f-4b7e-8c70-b809d2390460\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.204078 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-fernet-keys\") pod \"205ab4e1-725f-4b7e-8c70-b809d2390460\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.204128 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gln7b\" (UniqueName: \"kubernetes.io/projected/205ab4e1-725f-4b7e-8c70-b809d2390460-kube-api-access-gln7b\") pod \"205ab4e1-725f-4b7e-8c70-b809d2390460\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.204153 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-config-data\") pod \"205ab4e1-725f-4b7e-8c70-b809d2390460\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.204189 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-credential-keys\") pod \"205ab4e1-725f-4b7e-8c70-b809d2390460\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.204232 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-combined-ca-bundle\") pod \"205ab4e1-725f-4b7e-8c70-b809d2390460\" (UID: \"205ab4e1-725f-4b7e-8c70-b809d2390460\") " Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.209938 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/205ab4e1-725f-4b7e-8c70-b809d2390460-kube-api-access-gln7b" (OuterVolumeSpecName: "kube-api-access-gln7b") pod "205ab4e1-725f-4b7e-8c70-b809d2390460" (UID: "205ab4e1-725f-4b7e-8c70-b809d2390460"). InnerVolumeSpecName "kube-api-access-gln7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.210229 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "205ab4e1-725f-4b7e-8c70-b809d2390460" (UID: "205ab4e1-725f-4b7e-8c70-b809d2390460"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.210520 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "205ab4e1-725f-4b7e-8c70-b809d2390460" (UID: "205ab4e1-725f-4b7e-8c70-b809d2390460"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.213976 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-scripts" (OuterVolumeSpecName: "scripts") pod "205ab4e1-725f-4b7e-8c70-b809d2390460" (UID: "205ab4e1-725f-4b7e-8c70-b809d2390460"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.227563 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-config-data" (OuterVolumeSpecName: "config-data") pod "205ab4e1-725f-4b7e-8c70-b809d2390460" (UID: "205ab4e1-725f-4b7e-8c70-b809d2390460"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.232307 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "205ab4e1-725f-4b7e-8c70-b809d2390460" (UID: "205ab4e1-725f-4b7e-8c70-b809d2390460"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:04:19 crc kubenswrapper[4752]: E0929 11:04:19.287004 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/sg-core:latest" Sep 29 11:04:19 crc kubenswrapper[4752]: E0929 11:04:19.287155 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:sg-core,Image:quay.io/openstack-k8s-operators/sg-core:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:sg-core-conf-yaml,ReadOnly:false,MountPath:/etc/sg-core.conf.yaml,SubPath:sg-core.conf.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n5x54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_watcher-kuttl-default(c62050de-6e3f-4b09-a7a7-9038430e54ce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.306346 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.306379 4752 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 
11:04:19.306394 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gln7b\" (UniqueName: \"kubernetes.io/projected/205ab4e1-725f-4b7e-8c70-b809d2390460-kube-api-access-gln7b\") on node \"crc\" DevicePath \"\"" Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.306448 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.306464 4752 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.306477 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205ab4e1-725f-4b7e-8c70-b809d2390460-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.983383 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" event={"ID":"205ab4e1-725f-4b7e-8c70-b809d2390460","Type":"ContainerDied","Data":"7ceaf970dbd17551952f5fc56a624c1a985d760a014236ec185a043640d89af3"} Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.983423 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ceaf970dbd17551952f5fc56a624c1a985d760a014236ec185a043640d89af3" Sep 29 11:04:19 crc kubenswrapper[4752]: I0929 11:04:19.983428 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-fr8mq" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.288371 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-556fd57974-4ns5d"] Sep 29 11:04:20 crc kubenswrapper[4752]: E0929 11:04:20.289230 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="205ab4e1-725f-4b7e-8c70-b809d2390460" containerName="keystone-bootstrap" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.289254 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="205ab4e1-725f-4b7e-8c70-b809d2390460" containerName="keystone-bootstrap" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.289468 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="205ab4e1-725f-4b7e-8c70-b809d2390460" containerName="keystone-bootstrap" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.292326 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.295424 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.295642 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-keystone-public-svc" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.295794 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-xxmcc" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.296063 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-keystone-internal-svc" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.297153 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 
11:04:20.307743 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-556fd57974-4ns5d"] Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.310842 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.423571 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-fernet-keys\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.423634 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-credential-keys\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.423736 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-config-data\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.423781 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-public-tls-certs\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 
11:04:20.423835 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-internal-tls-certs\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.423867 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8krws\" (UniqueName: \"kubernetes.io/projected/514edfd0-8be7-4795-9f0e-fc403c69f692-kube-api-access-8krws\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.423896 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-combined-ca-bundle\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.423918 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-scripts\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.525622 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-fernet-keys\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " 
pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.525684 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-credential-keys\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.525746 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-config-data\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.525815 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-public-tls-certs\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.525856 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-internal-tls-certs\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.525886 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8krws\" (UniqueName: \"kubernetes.io/projected/514edfd0-8be7-4795-9f0e-fc403c69f692-kube-api-access-8krws\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " 
pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.525919 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-combined-ca-bundle\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.525941 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-scripts\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.530330 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-scripts\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.530376 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-public-tls-certs\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.532006 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-credential-keys\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc 
kubenswrapper[4752]: I0929 11:04:20.533433 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-config-data\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.533786 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-combined-ca-bundle\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.534692 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-fernet-keys\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.536525 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/514edfd0-8be7-4795-9f0e-fc403c69f692-internal-tls-certs\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.546091 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8krws\" (UniqueName: \"kubernetes.io/projected/514edfd0-8be7-4795-9f0e-fc403c69f692-kube-api-access-8krws\") pod \"keystone-556fd57974-4ns5d\" (UID: \"514edfd0-8be7-4795-9f0e-fc403c69f692\") " pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:20 crc kubenswrapper[4752]: I0929 11:04:20.617177 4752 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:21 crc kubenswrapper[4752]: I0929 11:04:21.080378 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-556fd57974-4ns5d"] Sep 29 11:04:21 crc kubenswrapper[4752]: W0929 11:04:21.085998 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod514edfd0_8be7_4795_9f0e_fc403c69f692.slice/crio-861ed5bcb664845fe3584fc554aaec55212aa36c3350698f95b907d491cfb219 WatchSource:0}: Error finding container 861ed5bcb664845fe3584fc554aaec55212aa36c3350698f95b907d491cfb219: Status 404 returned error can't find the container with id 861ed5bcb664845fe3584fc554aaec55212aa36c3350698f95b907d491cfb219 Sep 29 11:04:22 crc kubenswrapper[4752]: I0929 11:04:22.009239 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" event={"ID":"514edfd0-8be7-4795-9f0e-fc403c69f692","Type":"ContainerStarted","Data":"821131176be46270f1e6848aeb8d29b9c886c865819cd1c20f6877ed72491b36"} Sep 29 11:04:22 crc kubenswrapper[4752]: I0929 11:04:22.009625 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" event={"ID":"514edfd0-8be7-4795-9f0e-fc403c69f692","Type":"ContainerStarted","Data":"861ed5bcb664845fe3584fc554aaec55212aa36c3350698f95b907d491cfb219"} Sep 29 11:04:22 crc kubenswrapper[4752]: I0929 11:04:22.010308 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:22 crc kubenswrapper[4752]: I0929 11:04:22.034738 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" podStartSLOduration=2.034716897 podStartE2EDuration="2.034716897s" podCreationTimestamp="2025-09-29 11:04:20 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:04:22.029488273 +0000 UTC m=+1202.818629940" watchObservedRunningTime="2025-09-29 11:04:22.034716897 +0000 UTC m=+1202.823858564" Sep 29 11:04:26 crc kubenswrapper[4752]: E0929 11:04:26.927977 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/ceilometer-0" podUID="c62050de-6e3f-4b09-a7a7-9038430e54ce" Sep 29 11:04:27 crc kubenswrapper[4752]: I0929 11:04:27.056523 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c62050de-6e3f-4b09-a7a7-9038430e54ce","Type":"ContainerStarted","Data":"6bcc84e6ccd6280139c5d5e3340d7de35f2fa2a0e404398b943c14e9c306c992"} Sep 29 11:04:27 crc kubenswrapper[4752]: I0929 11:04:27.056715 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="c62050de-6e3f-4b09-a7a7-9038430e54ce" containerName="ceilometer-central-agent" containerID="cri-o://4c69423b5cd9bdf2002e45c57217a45936b03429dd6d0e7e112a95c1d82a1794" gracePeriod=30 Sep 29 11:04:27 crc kubenswrapper[4752]: I0929 11:04:27.056746 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:27 crc kubenswrapper[4752]: I0929 11:04:27.056843 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="c62050de-6e3f-4b09-a7a7-9038430e54ce" containerName="ceilometer-notification-agent" containerID="cri-o://20eb5c2cb774aa124a0c46c80c779afe1facd1d05932bbb28bee06d4866a9efc" gracePeriod=30 Sep 29 11:04:27 crc kubenswrapper[4752]: I0929 11:04:27.056786 4752 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="watcher-kuttl-default/ceilometer-0" podUID="c62050de-6e3f-4b09-a7a7-9038430e54ce" containerName="proxy-httpd" containerID="cri-o://6bcc84e6ccd6280139c5d5e3340d7de35f2fa2a0e404398b943c14e9c306c992" gracePeriod=30 Sep 29 11:04:28 crc kubenswrapper[4752]: I0929 11:04:28.066524 4752 generic.go:334] "Generic (PLEG): container finished" podID="c62050de-6e3f-4b09-a7a7-9038430e54ce" containerID="6bcc84e6ccd6280139c5d5e3340d7de35f2fa2a0e404398b943c14e9c306c992" exitCode=0 Sep 29 11:04:28 crc kubenswrapper[4752]: I0929 11:04:28.067020 4752 generic.go:334] "Generic (PLEG): container finished" podID="c62050de-6e3f-4b09-a7a7-9038430e54ce" containerID="4c69423b5cd9bdf2002e45c57217a45936b03429dd6d0e7e112a95c1d82a1794" exitCode=0 Sep 29 11:04:28 crc kubenswrapper[4752]: I0929 11:04:28.066607 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c62050de-6e3f-4b09-a7a7-9038430e54ce","Type":"ContainerDied","Data":"6bcc84e6ccd6280139c5d5e3340d7de35f2fa2a0e404398b943c14e9c306c992"} Sep 29 11:04:28 crc kubenswrapper[4752]: I0929 11:04:28.067071 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c62050de-6e3f-4b09-a7a7-9038430e54ce","Type":"ContainerDied","Data":"4c69423b5cd9bdf2002e45c57217a45936b03429dd6d0e7e112a95c1d82a1794"} Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.086169 4752 generic.go:334] "Generic (PLEG): container finished" podID="c62050de-6e3f-4b09-a7a7-9038430e54ce" containerID="20eb5c2cb774aa124a0c46c80c779afe1facd1d05932bbb28bee06d4866a9efc" exitCode=0 Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.086211 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c62050de-6e3f-4b09-a7a7-9038430e54ce","Type":"ContainerDied","Data":"20eb5c2cb774aa124a0c46c80c779afe1facd1d05932bbb28bee06d4866a9efc"} Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.185151 4752 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.319167 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-config-data\") pod \"c62050de-6e3f-4b09-a7a7-9038430e54ce\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.319260 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-sg-core-conf-yaml\") pod \"c62050de-6e3f-4b09-a7a7-9038430e54ce\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.319303 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5x54\" (UniqueName: \"kubernetes.io/projected/c62050de-6e3f-4b09-a7a7-9038430e54ce-kube-api-access-n5x54\") pod \"c62050de-6e3f-4b09-a7a7-9038430e54ce\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.319335 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-scripts\") pod \"c62050de-6e3f-4b09-a7a7-9038430e54ce\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.319385 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-combined-ca-bundle\") pod \"c62050de-6e3f-4b09-a7a7-9038430e54ce\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.319471 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c62050de-6e3f-4b09-a7a7-9038430e54ce-run-httpd\") pod \"c62050de-6e3f-4b09-a7a7-9038430e54ce\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.319503 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c62050de-6e3f-4b09-a7a7-9038430e54ce-log-httpd\") pod \"c62050de-6e3f-4b09-a7a7-9038430e54ce\" (UID: \"c62050de-6e3f-4b09-a7a7-9038430e54ce\") " Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.320245 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c62050de-6e3f-4b09-a7a7-9038430e54ce-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c62050de-6e3f-4b09-a7a7-9038430e54ce" (UID: "c62050de-6e3f-4b09-a7a7-9038430e54ce"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.320385 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c62050de-6e3f-4b09-a7a7-9038430e54ce-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c62050de-6e3f-4b09-a7a7-9038430e54ce" (UID: "c62050de-6e3f-4b09-a7a7-9038430e54ce"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.325454 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-scripts" (OuterVolumeSpecName: "scripts") pod "c62050de-6e3f-4b09-a7a7-9038430e54ce" (UID: "c62050de-6e3f-4b09-a7a7-9038430e54ce"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.325702 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c62050de-6e3f-4b09-a7a7-9038430e54ce" (UID: "c62050de-6e3f-4b09-a7a7-9038430e54ce"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.325855 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c62050de-6e3f-4b09-a7a7-9038430e54ce-kube-api-access-n5x54" (OuterVolumeSpecName: "kube-api-access-n5x54") pod "c62050de-6e3f-4b09-a7a7-9038430e54ce" (UID: "c62050de-6e3f-4b09-a7a7-9038430e54ce"). InnerVolumeSpecName "kube-api-access-n5x54". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.402939 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c62050de-6e3f-4b09-a7a7-9038430e54ce" (UID: "c62050de-6e3f-4b09-a7a7-9038430e54ce"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.421781 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.421827 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5x54\" (UniqueName: \"kubernetes.io/projected/c62050de-6e3f-4b09-a7a7-9038430e54ce-kube-api-access-n5x54\") on node \"crc\" DevicePath \"\"" Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.421844 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.421856 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.421868 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c62050de-6e3f-4b09-a7a7-9038430e54ce-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.421878 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c62050de-6e3f-4b09-a7a7-9038430e54ce-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.439391 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-config-data" (OuterVolumeSpecName: "config-data") pod "c62050de-6e3f-4b09-a7a7-9038430e54ce" (UID: "c62050de-6e3f-4b09-a7a7-9038430e54ce"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:04:30 crc kubenswrapper[4752]: I0929 11:04:30.523652 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c62050de-6e3f-4b09-a7a7-9038430e54ce-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.094634 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c62050de-6e3f-4b09-a7a7-9038430e54ce","Type":"ContainerDied","Data":"9b9bb5c68a181084b810560270743bb281b06b0e2fda4cd823f2c787f0bce922"} Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.094697 4752 scope.go:117] "RemoveContainer" containerID="6bcc84e6ccd6280139c5d5e3340d7de35f2fa2a0e404398b943c14e9c306c992" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.094711 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.123053 4752 scope.go:117] "RemoveContainer" containerID="20eb5c2cb774aa124a0c46c80c779afe1facd1d05932bbb28bee06d4866a9efc" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.153987 4752 scope.go:117] "RemoveContainer" containerID="4c69423b5cd9bdf2002e45c57217a45936b03429dd6d0e7e112a95c1d82a1794" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.162459 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.174531 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.203289 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:04:31 crc kubenswrapper[4752]: E0929 11:04:31.203717 4752 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c62050de-6e3f-4b09-a7a7-9038430e54ce" containerName="proxy-httpd" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.203741 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c62050de-6e3f-4b09-a7a7-9038430e54ce" containerName="proxy-httpd" Sep 29 11:04:31 crc kubenswrapper[4752]: E0929 11:04:31.203769 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c62050de-6e3f-4b09-a7a7-9038430e54ce" containerName="ceilometer-central-agent" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.203778 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c62050de-6e3f-4b09-a7a7-9038430e54ce" containerName="ceilometer-central-agent" Sep 29 11:04:31 crc kubenswrapper[4752]: E0929 11:04:31.203817 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c62050de-6e3f-4b09-a7a7-9038430e54ce" containerName="ceilometer-notification-agent" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.203827 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c62050de-6e3f-4b09-a7a7-9038430e54ce" containerName="ceilometer-notification-agent" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.204035 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="c62050de-6e3f-4b09-a7a7-9038430e54ce" containerName="ceilometer-central-agent" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.204057 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="c62050de-6e3f-4b09-a7a7-9038430e54ce" containerName="ceilometer-notification-agent" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.204080 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="c62050de-6e3f-4b09-a7a7-9038430e54ce" containerName="proxy-httpd" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.206925 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.211722 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.211985 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.216636 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.335223 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-scripts\") pod \"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.335269 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.335298 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55d465d9-e21f-439a-82b6-3479e802de2d-log-httpd\") pod \"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.335321 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55d465d9-e21f-439a-82b6-3479e802de2d-run-httpd\") pod 
\"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.335690 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-config-data\") pod \"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.335792 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.335936 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmx4f\" (UniqueName: \"kubernetes.io/projected/55d465d9-e21f-439a-82b6-3479e802de2d-kube-api-access-nmx4f\") pod \"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.437358 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-config-data\") pod \"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.437466 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " 
pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.437556 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmx4f\" (UniqueName: \"kubernetes.io/projected/55d465d9-e21f-439a-82b6-3479e802de2d-kube-api-access-nmx4f\") pod \"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.437618 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-scripts\") pod \"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.437650 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.437699 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55d465d9-e21f-439a-82b6-3479e802de2d-log-httpd\") pod \"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.437782 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55d465d9-e21f-439a-82b6-3479e802de2d-run-httpd\") pod \"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.438421 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55d465d9-e21f-439a-82b6-3479e802de2d-run-httpd\") pod \"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.439057 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55d465d9-e21f-439a-82b6-3479e802de2d-log-httpd\") pod \"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.443779 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-scripts\") pod \"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.444077 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.444176 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.445890 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-config-data\") pod \"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " pod="watcher-kuttl-default/ceilometer-0" 
Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.460067 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmx4f\" (UniqueName: \"kubernetes.io/projected/55d465d9-e21f-439a-82b6-3479e802de2d-kube-api-access-nmx4f\") pod \"ceilometer-0\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.530160 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:31 crc kubenswrapper[4752]: I0929 11:04:31.991473 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:04:32 crc kubenswrapper[4752]: I0929 11:04:32.041846 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c62050de-6e3f-4b09-a7a7-9038430e54ce" path="/var/lib/kubelet/pods/c62050de-6e3f-4b09-a7a7-9038430e54ce/volumes" Sep 29 11:04:32 crc kubenswrapper[4752]: I0929 11:04:32.108662 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"55d465d9-e21f-439a-82b6-3479e802de2d","Type":"ContainerStarted","Data":"3a51239a3cdaa3051e339a16ff25176ebe4e292ff13f4076922c88b06bfc6b21"} Sep 29 11:04:33 crc kubenswrapper[4752]: I0929 11:04:33.120941 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"55d465d9-e21f-439a-82b6-3479e802de2d","Type":"ContainerStarted","Data":"4e8892a588bf4b63f0fddc92dd878a75d46621ef8097799d1d5607553b221fea"} Sep 29 11:04:34 crc kubenswrapper[4752]: I0929 11:04:34.133396 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"55d465d9-e21f-439a-82b6-3479e802de2d","Type":"ContainerStarted","Data":"58191b9ef2d283ef09b494d660b4bd24a94f3194e4301effa22f7b53893466f9"} Sep 29 11:04:34 crc kubenswrapper[4752]: I0929 11:04:34.133734 4752 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"55d465d9-e21f-439a-82b6-3479e802de2d","Type":"ContainerStarted","Data":"1fb13b4b68f3d03678d834847478eaa955b615ed728c6bbe1e9212248f03118b"} Sep 29 11:04:36 crc kubenswrapper[4752]: I0929 11:04:36.155185 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"55d465d9-e21f-439a-82b6-3479e802de2d","Type":"ContainerStarted","Data":"e0c9f0bb35fa45ecabfadad578f00bc4869ad79780aff492b42f6761e87da1e1"} Sep 29 11:04:36 crc kubenswrapper[4752]: I0929 11:04:36.155839 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:04:36 crc kubenswrapper[4752]: I0929 11:04:36.179154 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.806631607 podStartE2EDuration="5.179127975s" podCreationTimestamp="2025-09-29 11:04:31 +0000 UTC" firstStartedPulling="2025-09-29 11:04:31.994713603 +0000 UTC m=+1212.783855300" lastFinishedPulling="2025-09-29 11:04:35.367209961 +0000 UTC m=+1216.156351668" observedRunningTime="2025-09-29 11:04:36.173651124 +0000 UTC m=+1216.962792801" watchObservedRunningTime="2025-09-29 11:04:36.179127975 +0000 UTC m=+1216.968269682" Sep 29 11:04:52 crc kubenswrapper[4752]: I0929 11:04:52.197104 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/keystone-556fd57974-4ns5d" Sep 29 11:04:53 crc kubenswrapper[4752]: I0929 11:04:53.150235 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/openstackclient"] Sep 29 11:04:53 crc kubenswrapper[4752]: I0929 11:04:53.152111 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/openstackclient" Sep 29 11:04:53 crc kubenswrapper[4752]: I0929 11:04:53.161153 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"openstack-config-secret" Sep 29 11:04:53 crc kubenswrapper[4752]: I0929 11:04:53.161170 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"openstackclient-openstackclient-dockercfg-x2j86" Sep 29 11:04:53 crc kubenswrapper[4752]: I0929 11:04:53.164107 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openstack-config" Sep 29 11:04:53 crc kubenswrapper[4752]: I0929 11:04:53.168101 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstackclient"] Sep 29 11:04:53 crc kubenswrapper[4752]: I0929 11:04:53.242678 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/228c713d-fd14-4b79-9658-e5e4dd26d2e0-openstack-config-secret\") pod \"openstackclient\" (UID: \"228c713d-fd14-4b79-9658-e5e4dd26d2e0\") " pod="watcher-kuttl-default/openstackclient" Sep 29 11:04:53 crc kubenswrapper[4752]: I0929 11:04:53.242744 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228c713d-fd14-4b79-9658-e5e4dd26d2e0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"228c713d-fd14-4b79-9658-e5e4dd26d2e0\") " pod="watcher-kuttl-default/openstackclient" Sep 29 11:04:53 crc kubenswrapper[4752]: I0929 11:04:53.242779 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/228c713d-fd14-4b79-9658-e5e4dd26d2e0-openstack-config\") pod \"openstackclient\" (UID: \"228c713d-fd14-4b79-9658-e5e4dd26d2e0\") " pod="watcher-kuttl-default/openstackclient" 
Sep 29 11:04:53 crc kubenswrapper[4752]: I0929 11:04:53.243105 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46xjg\" (UniqueName: \"kubernetes.io/projected/228c713d-fd14-4b79-9658-e5e4dd26d2e0-kube-api-access-46xjg\") pod \"openstackclient\" (UID: \"228c713d-fd14-4b79-9658-e5e4dd26d2e0\") " pod="watcher-kuttl-default/openstackclient" Sep 29 11:04:53 crc kubenswrapper[4752]: I0929 11:04:53.344660 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46xjg\" (UniqueName: \"kubernetes.io/projected/228c713d-fd14-4b79-9658-e5e4dd26d2e0-kube-api-access-46xjg\") pod \"openstackclient\" (UID: \"228c713d-fd14-4b79-9658-e5e4dd26d2e0\") " pod="watcher-kuttl-default/openstackclient" Sep 29 11:04:53 crc kubenswrapper[4752]: I0929 11:04:53.344745 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/228c713d-fd14-4b79-9658-e5e4dd26d2e0-openstack-config-secret\") pod \"openstackclient\" (UID: \"228c713d-fd14-4b79-9658-e5e4dd26d2e0\") " pod="watcher-kuttl-default/openstackclient" Sep 29 11:04:53 crc kubenswrapper[4752]: I0929 11:04:53.344770 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228c713d-fd14-4b79-9658-e5e4dd26d2e0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"228c713d-fd14-4b79-9658-e5e4dd26d2e0\") " pod="watcher-kuttl-default/openstackclient" Sep 29 11:04:53 crc kubenswrapper[4752]: I0929 11:04:53.344793 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/228c713d-fd14-4b79-9658-e5e4dd26d2e0-openstack-config\") pod \"openstackclient\" (UID: \"228c713d-fd14-4b79-9658-e5e4dd26d2e0\") " pod="watcher-kuttl-default/openstackclient" Sep 29 11:04:53 crc kubenswrapper[4752]: I0929 
11:04:53.345663 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/228c713d-fd14-4b79-9658-e5e4dd26d2e0-openstack-config\") pod \"openstackclient\" (UID: \"228c713d-fd14-4b79-9658-e5e4dd26d2e0\") " pod="watcher-kuttl-default/openstackclient" Sep 29 11:04:53 crc kubenswrapper[4752]: I0929 11:04:53.352935 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/228c713d-fd14-4b79-9658-e5e4dd26d2e0-openstack-config-secret\") pod \"openstackclient\" (UID: \"228c713d-fd14-4b79-9658-e5e4dd26d2e0\") " pod="watcher-kuttl-default/openstackclient" Sep 29 11:04:53 crc kubenswrapper[4752]: I0929 11:04:53.375434 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228c713d-fd14-4b79-9658-e5e4dd26d2e0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"228c713d-fd14-4b79-9658-e5e4dd26d2e0\") " pod="watcher-kuttl-default/openstackclient" Sep 29 11:04:53 crc kubenswrapper[4752]: I0929 11:04:53.385381 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46xjg\" (UniqueName: \"kubernetes.io/projected/228c713d-fd14-4b79-9658-e5e4dd26d2e0-kube-api-access-46xjg\") pod \"openstackclient\" (UID: \"228c713d-fd14-4b79-9658-e5e4dd26d2e0\") " pod="watcher-kuttl-default/openstackclient" Sep 29 11:04:53 crc kubenswrapper[4752]: I0929 11:04:53.484366 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/openstackclient" Sep 29 11:04:53 crc kubenswrapper[4752]: I0929 11:04:53.960742 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstackclient"] Sep 29 11:04:54 crc kubenswrapper[4752]: I0929 11:04:54.328589 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstackclient" event={"ID":"228c713d-fd14-4b79-9658-e5e4dd26d2e0","Type":"ContainerStarted","Data":"d3bd5df56a7e14cc4c2cc1fc70fb55667008242e5b01a879d1c3d77604571063"} Sep 29 11:04:56 crc kubenswrapper[4752]: I0929 11:04:56.176087 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 11:04:56 crc kubenswrapper[4752]: I0929 11:04:56.176529 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:05:01 crc kubenswrapper[4752]: I0929 11:05:01.535460 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:04 crc kubenswrapper[4752]: I0929 11:05:04.443961 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstackclient" event={"ID":"228c713d-fd14-4b79-9658-e5e4dd26d2e0","Type":"ContainerStarted","Data":"0d794cdeae79a36814a2a8c85d6ea71c8cd2b4d99c2dd39b47c238d70ad7d655"} Sep 29 11:05:04 crc kubenswrapper[4752]: I0929 11:05:04.461514 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/openstackclient" podStartSLOduration=1.832756263 
podStartE2EDuration="11.461495097s" podCreationTimestamp="2025-09-29 11:04:53 +0000 UTC" firstStartedPulling="2025-09-29 11:04:53.972921771 +0000 UTC m=+1234.762063438" lastFinishedPulling="2025-09-29 11:05:03.601660585 +0000 UTC m=+1244.390802272" observedRunningTime="2025-09-29 11:05:04.460369588 +0000 UTC m=+1245.249511255" watchObservedRunningTime="2025-09-29 11:05:04.461495097 +0000 UTC m=+1245.250636764" Sep 29 11:05:04 crc kubenswrapper[4752]: I0929 11:05:04.744041 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Sep 29 11:05:04 crc kubenswrapper[4752]: I0929 11:05:04.744249 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/kube-state-metrics-0" podUID="238407f7-3389-4791-becd-0852b1e66cea" containerName="kube-state-metrics" containerID="cri-o://1d0f583aba0a2fd55be194208a8969712a520a0590bc133440bf316da56eb8cb" gracePeriod=30 Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.248670 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.373875 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv5qf\" (UniqueName: \"kubernetes.io/projected/238407f7-3389-4791-becd-0852b1e66cea-kube-api-access-vv5qf\") pod \"238407f7-3389-4791-becd-0852b1e66cea\" (UID: \"238407f7-3389-4791-becd-0852b1e66cea\") " Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.380136 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/238407f7-3389-4791-becd-0852b1e66cea-kube-api-access-vv5qf" (OuterVolumeSpecName: "kube-api-access-vv5qf") pod "238407f7-3389-4791-becd-0852b1e66cea" (UID: "238407f7-3389-4791-becd-0852b1e66cea"). InnerVolumeSpecName "kube-api-access-vv5qf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.465314 4752 generic.go:334] "Generic (PLEG): container finished" podID="238407f7-3389-4791-becd-0852b1e66cea" containerID="1d0f583aba0a2fd55be194208a8969712a520a0590bc133440bf316da56eb8cb" exitCode=2 Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.465382 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.465395 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"238407f7-3389-4791-becd-0852b1e66cea","Type":"ContainerDied","Data":"1d0f583aba0a2fd55be194208a8969712a520a0590bc133440bf316da56eb8cb"} Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.465449 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"238407f7-3389-4791-becd-0852b1e66cea","Type":"ContainerDied","Data":"93dffe600ec8f59a4c6baa802188934701b56823fb0c2c00a7d0152e4e51dd24"} Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.465470 4752 scope.go:117] "RemoveContainer" containerID="1d0f583aba0a2fd55be194208a8969712a520a0590bc133440bf316da56eb8cb" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.475604 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv5qf\" (UniqueName: \"kubernetes.io/projected/238407f7-3389-4791-becd-0852b1e66cea-kube-api-access-vv5qf\") on node \"crc\" DevicePath \"\"" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.495250 4752 scope.go:117] "RemoveContainer" containerID="1d0f583aba0a2fd55be194208a8969712a520a0590bc133440bf316da56eb8cb" Sep 29 11:05:05 crc kubenswrapper[4752]: E0929 11:05:05.495762 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1d0f583aba0a2fd55be194208a8969712a520a0590bc133440bf316da56eb8cb\": container with ID starting with 1d0f583aba0a2fd55be194208a8969712a520a0590bc133440bf316da56eb8cb not found: ID does not exist" containerID="1d0f583aba0a2fd55be194208a8969712a520a0590bc133440bf316da56eb8cb" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.495795 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d0f583aba0a2fd55be194208a8969712a520a0590bc133440bf316da56eb8cb"} err="failed to get container status \"1d0f583aba0a2fd55be194208a8969712a520a0590bc133440bf316da56eb8cb\": rpc error: code = NotFound desc = could not find container \"1d0f583aba0a2fd55be194208a8969712a520a0590bc133440bf316da56eb8cb\": container with ID starting with 1d0f583aba0a2fd55be194208a8969712a520a0590bc133440bf316da56eb8cb not found: ID does not exist" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.502876 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.511701 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.523663 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Sep 29 11:05:05 crc kubenswrapper[4752]: E0929 11:05:05.524126 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="238407f7-3389-4791-becd-0852b1e66cea" containerName="kube-state-metrics" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.524152 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="238407f7-3389-4791-becd-0852b1e66cea" containerName="kube-state-metrics" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.524360 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="238407f7-3389-4791-becd-0852b1e66cea" containerName="kube-state-metrics" Sep 29 11:05:05 crc 
kubenswrapper[4752]: I0929 11:05:05.525120 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.527256 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-kube-state-metrics-svc" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.529589 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"kube-state-metrics-tls-config" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.535298 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.679288 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ebf91ee-06e1-493b-89ad-8af75463aa3e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8ebf91ee-06e1-493b-89ad-8af75463aa3e\") " pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.679377 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8ebf91ee-06e1-493b-89ad-8af75463aa3e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8ebf91ee-06e1-493b-89ad-8af75463aa3e\") " pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.679682 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7pv4\" (UniqueName: \"kubernetes.io/projected/8ebf91ee-06e1-493b-89ad-8af75463aa3e-kube-api-access-z7pv4\") pod \"kube-state-metrics-0\" (UID: \"8ebf91ee-06e1-493b-89ad-8af75463aa3e\") " pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:05:05 
crc kubenswrapper[4752]: I0929 11:05:05.679761 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ebf91ee-06e1-493b-89ad-8af75463aa3e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8ebf91ee-06e1-493b-89ad-8af75463aa3e\") " pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.781122 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ebf91ee-06e1-493b-89ad-8af75463aa3e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8ebf91ee-06e1-493b-89ad-8af75463aa3e\") " pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.781248 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8ebf91ee-06e1-493b-89ad-8af75463aa3e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8ebf91ee-06e1-493b-89ad-8af75463aa3e\") " pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.781382 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7pv4\" (UniqueName: \"kubernetes.io/projected/8ebf91ee-06e1-493b-89ad-8af75463aa3e-kube-api-access-z7pv4\") pod \"kube-state-metrics-0\" (UID: \"8ebf91ee-06e1-493b-89ad-8af75463aa3e\") " pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.781427 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ebf91ee-06e1-493b-89ad-8af75463aa3e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8ebf91ee-06e1-493b-89ad-8af75463aa3e\") " 
pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.786497 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8ebf91ee-06e1-493b-89ad-8af75463aa3e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8ebf91ee-06e1-493b-89ad-8af75463aa3e\") " pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.786793 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ebf91ee-06e1-493b-89ad-8af75463aa3e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8ebf91ee-06e1-493b-89ad-8af75463aa3e\") " pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.790766 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ebf91ee-06e1-493b-89ad-8af75463aa3e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8ebf91ee-06e1-493b-89ad-8af75463aa3e\") " pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.811350 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7pv4\" (UniqueName: \"kubernetes.io/projected/8ebf91ee-06e1-493b-89ad-8af75463aa3e-kube-api-access-z7pv4\") pod \"kube-state-metrics-0\" (UID: \"8ebf91ee-06e1-493b-89ad-8af75463aa3e\") " pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:05:05 crc kubenswrapper[4752]: I0929 11:05:05.845978 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:05:06 crc kubenswrapper[4752]: I0929 11:05:06.043086 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="238407f7-3389-4791-becd-0852b1e66cea" path="/var/lib/kubelet/pods/238407f7-3389-4791-becd-0852b1e66cea/volumes" Sep 29 11:05:06 crc kubenswrapper[4752]: I0929 11:05:06.158105 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:05:06 crc kubenswrapper[4752]: I0929 11:05:06.158375 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="55d465d9-e21f-439a-82b6-3479e802de2d" containerName="ceilometer-central-agent" containerID="cri-o://4e8892a588bf4b63f0fddc92dd878a75d46621ef8097799d1d5607553b221fea" gracePeriod=30 Sep 29 11:05:06 crc kubenswrapper[4752]: I0929 11:05:06.158440 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="55d465d9-e21f-439a-82b6-3479e802de2d" containerName="sg-core" containerID="cri-o://58191b9ef2d283ef09b494d660b4bd24a94f3194e4301effa22f7b53893466f9" gracePeriod=30 Sep 29 11:05:06 crc kubenswrapper[4752]: I0929 11:05:06.158466 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="55d465d9-e21f-439a-82b6-3479e802de2d" containerName="proxy-httpd" containerID="cri-o://e0c9f0bb35fa45ecabfadad578f00bc4869ad79780aff492b42f6761e87da1e1" gracePeriod=30 Sep 29 11:05:06 crc kubenswrapper[4752]: I0929 11:05:06.158571 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="55d465d9-e21f-439a-82b6-3479e802de2d" containerName="ceilometer-notification-agent" containerID="cri-o://1fb13b4b68f3d03678d834847478eaa955b615ed728c6bbe1e9212248f03118b" gracePeriod=30 Sep 29 11:05:06 crc kubenswrapper[4752]: I0929 11:05:06.326579 4752 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Sep 29 11:05:06 crc kubenswrapper[4752]: W0929 11:05:06.332355 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ebf91ee_06e1_493b_89ad_8af75463aa3e.slice/crio-7f5c9d6722768e206c1b867ee047a0c0a4300452949fe6199be287a2e9eb3c9a WatchSource:0}: Error finding container 7f5c9d6722768e206c1b867ee047a0c0a4300452949fe6199be287a2e9eb3c9a: Status 404 returned error can't find the container with id 7f5c9d6722768e206c1b867ee047a0c0a4300452949fe6199be287a2e9eb3c9a Sep 29 11:05:06 crc kubenswrapper[4752]: I0929 11:05:06.473988 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"8ebf91ee-06e1-493b-89ad-8af75463aa3e","Type":"ContainerStarted","Data":"7f5c9d6722768e206c1b867ee047a0c0a4300452949fe6199be287a2e9eb3c9a"} Sep 29 11:05:06 crc kubenswrapper[4752]: I0929 11:05:06.479775 4752 generic.go:334] "Generic (PLEG): container finished" podID="55d465d9-e21f-439a-82b6-3479e802de2d" containerID="e0c9f0bb35fa45ecabfadad578f00bc4869ad79780aff492b42f6761e87da1e1" exitCode=0 Sep 29 11:05:06 crc kubenswrapper[4752]: I0929 11:05:06.480086 4752 generic.go:334] "Generic (PLEG): container finished" podID="55d465d9-e21f-439a-82b6-3479e802de2d" containerID="58191b9ef2d283ef09b494d660b4bd24a94f3194e4301effa22f7b53893466f9" exitCode=2 Sep 29 11:05:06 crc kubenswrapper[4752]: I0929 11:05:06.479828 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"55d465d9-e21f-439a-82b6-3479e802de2d","Type":"ContainerDied","Data":"e0c9f0bb35fa45ecabfadad578f00bc4869ad79780aff492b42f6761e87da1e1"} Sep 29 11:05:06 crc kubenswrapper[4752]: I0929 11:05:06.480115 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"55d465d9-e21f-439a-82b6-3479e802de2d","Type":"ContainerDied","Data":"58191b9ef2d283ef09b494d660b4bd24a94f3194e4301effa22f7b53893466f9"} Sep 29 11:05:07 crc kubenswrapper[4752]: I0929 11:05:07.488364 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"8ebf91ee-06e1-493b-89ad-8af75463aa3e","Type":"ContainerStarted","Data":"fec3941e325251e9d21436449216242d82b798262ff1a964d9c40516f53db56b"} Sep 29 11:05:07 crc kubenswrapper[4752]: I0929 11:05:07.488828 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:05:07 crc kubenswrapper[4752]: I0929 11:05:07.492994 4752 generic.go:334] "Generic (PLEG): container finished" podID="55d465d9-e21f-439a-82b6-3479e802de2d" containerID="4e8892a588bf4b63f0fddc92dd878a75d46621ef8097799d1d5607553b221fea" exitCode=0 Sep 29 11:05:07 crc kubenswrapper[4752]: I0929 11:05:07.493036 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"55d465d9-e21f-439a-82b6-3479e802de2d","Type":"ContainerDied","Data":"4e8892a588bf4b63f0fddc92dd878a75d46621ef8097799d1d5607553b221fea"} Sep 29 11:05:07 crc kubenswrapper[4752]: I0929 11:05:07.510497 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/kube-state-metrics-0" podStartSLOduration=2.160700228 podStartE2EDuration="2.51047497s" podCreationTimestamp="2025-09-29 11:05:05 +0000 UTC" firstStartedPulling="2025-09-29 11:05:06.33478796 +0000 UTC m=+1247.123929627" lastFinishedPulling="2025-09-29 11:05:06.684562702 +0000 UTC m=+1247.473704369" observedRunningTime="2025-09-29 11:05:07.503175112 +0000 UTC m=+1248.292316779" watchObservedRunningTime="2025-09-29 11:05:07.51047497 +0000 UTC m=+1248.299616677" Sep 29 11:05:08 crc kubenswrapper[4752]: I0929 11:05:08.443130 4752 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["watcher-kuttl-default/watcher-db-create-plzcj"] Sep 29 11:05:08 crc kubenswrapper[4752]: I0929 11:05:08.445835 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-plzcj" Sep 29 11:05:08 crc kubenswrapper[4752]: I0929 11:05:08.471609 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-plzcj"] Sep 29 11:05:08 crc kubenswrapper[4752]: I0929 11:05:08.538116 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qsn4\" (UniqueName: \"kubernetes.io/projected/a3ca3474-aa46-4a60-8451-524755f59b62-kube-api-access-4qsn4\") pod \"watcher-db-create-plzcj\" (UID: \"a3ca3474-aa46-4a60-8451-524755f59b62\") " pod="watcher-kuttl-default/watcher-db-create-plzcj" Sep 29 11:05:08 crc kubenswrapper[4752]: I0929 11:05:08.639210 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qsn4\" (UniqueName: \"kubernetes.io/projected/a3ca3474-aa46-4a60-8451-524755f59b62-kube-api-access-4qsn4\") pod \"watcher-db-create-plzcj\" (UID: \"a3ca3474-aa46-4a60-8451-524755f59b62\") " pod="watcher-kuttl-default/watcher-db-create-plzcj" Sep 29 11:05:08 crc kubenswrapper[4752]: I0929 11:05:08.657651 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qsn4\" (UniqueName: \"kubernetes.io/projected/a3ca3474-aa46-4a60-8451-524755f59b62-kube-api-access-4qsn4\") pod \"watcher-db-create-plzcj\" (UID: \"a3ca3474-aa46-4a60-8451-524755f59b62\") " pod="watcher-kuttl-default/watcher-db-create-plzcj" Sep 29 11:05:08 crc kubenswrapper[4752]: I0929 11:05:08.783398 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-plzcj" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.127680 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.256931 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmx4f\" (UniqueName: \"kubernetes.io/projected/55d465d9-e21f-439a-82b6-3479e802de2d-kube-api-access-nmx4f\") pod \"55d465d9-e21f-439a-82b6-3479e802de2d\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.257005 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-scripts\") pod \"55d465d9-e21f-439a-82b6-3479e802de2d\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.257206 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-combined-ca-bundle\") pod \"55d465d9-e21f-439a-82b6-3479e802de2d\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.257265 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-config-data\") pod \"55d465d9-e21f-439a-82b6-3479e802de2d\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.257297 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-sg-core-conf-yaml\") pod \"55d465d9-e21f-439a-82b6-3479e802de2d\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.257352 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/55d465d9-e21f-439a-82b6-3479e802de2d-log-httpd\") pod \"55d465d9-e21f-439a-82b6-3479e802de2d\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.257396 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55d465d9-e21f-439a-82b6-3479e802de2d-run-httpd\") pod \"55d465d9-e21f-439a-82b6-3479e802de2d\" (UID: \"55d465d9-e21f-439a-82b6-3479e802de2d\") " Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.258302 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55d465d9-e21f-439a-82b6-3479e802de2d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "55d465d9-e21f-439a-82b6-3479e802de2d" (UID: "55d465d9-e21f-439a-82b6-3479e802de2d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.258579 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55d465d9-e21f-439a-82b6-3479e802de2d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "55d465d9-e21f-439a-82b6-3479e802de2d" (UID: "55d465d9-e21f-439a-82b6-3479e802de2d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.262536 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55d465d9-e21f-439a-82b6-3479e802de2d-kube-api-access-nmx4f" (OuterVolumeSpecName: "kube-api-access-nmx4f") pod "55d465d9-e21f-439a-82b6-3479e802de2d" (UID: "55d465d9-e21f-439a-82b6-3479e802de2d"). InnerVolumeSpecName "kube-api-access-nmx4f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.263184 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-scripts" (OuterVolumeSpecName: "scripts") pod "55d465d9-e21f-439a-82b6-3479e802de2d" (UID: "55d465d9-e21f-439a-82b6-3479e802de2d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.293071 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "55d465d9-e21f-439a-82b6-3479e802de2d" (UID: "55d465d9-e21f-439a-82b6-3479e802de2d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.301267 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-plzcj"] Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.359846 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmx4f\" (UniqueName: \"kubernetes.io/projected/55d465d9-e21f-439a-82b6-3479e802de2d-kube-api-access-nmx4f\") on node \"crc\" DevicePath \"\"" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.360254 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.360269 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.360280 4752 reconciler_common.go:293] "Volume 
detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55d465d9-e21f-439a-82b6-3479e802de2d-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.360293 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55d465d9-e21f-439a-82b6-3479e802de2d-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.362242 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55d465d9-e21f-439a-82b6-3479e802de2d" (UID: "55d465d9-e21f-439a-82b6-3479e802de2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.376489 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-config-data" (OuterVolumeSpecName: "config-data") pod "55d465d9-e21f-439a-82b6-3479e802de2d" (UID: "55d465d9-e21f-439a-82b6-3479e802de2d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.461926 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.462025 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55d465d9-e21f-439a-82b6-3479e802de2d-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.510323 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-plzcj" event={"ID":"a3ca3474-aa46-4a60-8451-524755f59b62","Type":"ContainerStarted","Data":"1ebb764c140be5cb6d96d917c5c99c0adcfd7a731aea30b903e1c7551a5a8f0a"} Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.510371 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-plzcj" event={"ID":"a3ca3474-aa46-4a60-8451-524755f59b62","Type":"ContainerStarted","Data":"6b961233ef8631f0236a6f61583e69e7f65be62d10160ac574a00e6cfb7f558f"} Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.513385 4752 generic.go:334] "Generic (PLEG): container finished" podID="55d465d9-e21f-439a-82b6-3479e802de2d" containerID="1fb13b4b68f3d03678d834847478eaa955b615ed728c6bbe1e9212248f03118b" exitCode=0 Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.513445 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"55d465d9-e21f-439a-82b6-3479e802de2d","Type":"ContainerDied","Data":"1fb13b4b68f3d03678d834847478eaa955b615ed728c6bbe1e9212248f03118b"} Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.513487 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"55d465d9-e21f-439a-82b6-3479e802de2d","Type":"ContainerDied","Data":"3a51239a3cdaa3051e339a16ff25176ebe4e292ff13f4076922c88b06bfc6b21"} Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.513486 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.513515 4752 scope.go:117] "RemoveContainer" containerID="e0c9f0bb35fa45ecabfadad578f00bc4869ad79780aff492b42f6761e87da1e1" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.528980 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-db-create-plzcj" podStartSLOduration=1.5289589430000001 podStartE2EDuration="1.528958943s" podCreationTimestamp="2025-09-29 11:05:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:05:09.526783508 +0000 UTC m=+1250.315925185" watchObservedRunningTime="2025-09-29 11:05:09.528958943 +0000 UTC m=+1250.318100610" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.537562 4752 scope.go:117] "RemoveContainer" containerID="58191b9ef2d283ef09b494d660b4bd24a94f3194e4301effa22f7b53893466f9" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.563935 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.572549 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.588403 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:05:09 crc kubenswrapper[4752]: E0929 11:05:09.588729 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d465d9-e21f-439a-82b6-3479e802de2d" containerName="ceilometer-central-agent" Sep 29 11:05:09 crc 
kubenswrapper[4752]: I0929 11:05:09.588746 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d465d9-e21f-439a-82b6-3479e802de2d" containerName="ceilometer-central-agent" Sep 29 11:05:09 crc kubenswrapper[4752]: E0929 11:05:09.588780 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d465d9-e21f-439a-82b6-3479e802de2d" containerName="ceilometer-notification-agent" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.588787 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d465d9-e21f-439a-82b6-3479e802de2d" containerName="ceilometer-notification-agent" Sep 29 11:05:09 crc kubenswrapper[4752]: E0929 11:05:09.588812 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d465d9-e21f-439a-82b6-3479e802de2d" containerName="proxy-httpd" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.588818 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d465d9-e21f-439a-82b6-3479e802de2d" containerName="proxy-httpd" Sep 29 11:05:09 crc kubenswrapper[4752]: E0929 11:05:09.588830 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d465d9-e21f-439a-82b6-3479e802de2d" containerName="sg-core" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.588835 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d465d9-e21f-439a-82b6-3479e802de2d" containerName="sg-core" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.588998 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="55d465d9-e21f-439a-82b6-3479e802de2d" containerName="proxy-httpd" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.589011 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="55d465d9-e21f-439a-82b6-3479e802de2d" containerName="sg-core" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.589023 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="55d465d9-e21f-439a-82b6-3479e802de2d" containerName="ceilometer-notification-agent" Sep 29 
11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.589030 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="55d465d9-e21f-439a-82b6-3479e802de2d" containerName="ceilometer-central-agent" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.593709 4752 scope.go:117] "RemoveContainer" containerID="1fb13b4b68f3d03678d834847478eaa955b615ed728c6bbe1e9212248f03118b" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.597543 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.601423 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.601731 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.601901 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.632629 4752 scope.go:117] "RemoveContainer" containerID="4e8892a588bf4b63f0fddc92dd878a75d46621ef8097799d1d5607553b221fea" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.635941 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.659614 4752 scope.go:117] "RemoveContainer" containerID="e0c9f0bb35fa45ecabfadad578f00bc4869ad79780aff492b42f6761e87da1e1" Sep 29 11:05:09 crc kubenswrapper[4752]: E0929 11:05:09.661443 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0c9f0bb35fa45ecabfadad578f00bc4869ad79780aff492b42f6761e87da1e1\": container with ID starting with e0c9f0bb35fa45ecabfadad578f00bc4869ad79780aff492b42f6761e87da1e1 not found: ID does not 
exist" containerID="e0c9f0bb35fa45ecabfadad578f00bc4869ad79780aff492b42f6761e87da1e1" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.661492 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0c9f0bb35fa45ecabfadad578f00bc4869ad79780aff492b42f6761e87da1e1"} err="failed to get container status \"e0c9f0bb35fa45ecabfadad578f00bc4869ad79780aff492b42f6761e87da1e1\": rpc error: code = NotFound desc = could not find container \"e0c9f0bb35fa45ecabfadad578f00bc4869ad79780aff492b42f6761e87da1e1\": container with ID starting with e0c9f0bb35fa45ecabfadad578f00bc4869ad79780aff492b42f6761e87da1e1 not found: ID does not exist" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.661513 4752 scope.go:117] "RemoveContainer" containerID="58191b9ef2d283ef09b494d660b4bd24a94f3194e4301effa22f7b53893466f9" Sep 29 11:05:09 crc kubenswrapper[4752]: E0929 11:05:09.662085 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58191b9ef2d283ef09b494d660b4bd24a94f3194e4301effa22f7b53893466f9\": container with ID starting with 58191b9ef2d283ef09b494d660b4bd24a94f3194e4301effa22f7b53893466f9 not found: ID does not exist" containerID="58191b9ef2d283ef09b494d660b4bd24a94f3194e4301effa22f7b53893466f9" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.662106 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58191b9ef2d283ef09b494d660b4bd24a94f3194e4301effa22f7b53893466f9"} err="failed to get container status \"58191b9ef2d283ef09b494d660b4bd24a94f3194e4301effa22f7b53893466f9\": rpc error: code = NotFound desc = could not find container \"58191b9ef2d283ef09b494d660b4bd24a94f3194e4301effa22f7b53893466f9\": container with ID starting with 58191b9ef2d283ef09b494d660b4bd24a94f3194e4301effa22f7b53893466f9 not found: ID does not exist" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.662120 4752 scope.go:117] 
"RemoveContainer" containerID="1fb13b4b68f3d03678d834847478eaa955b615ed728c6bbe1e9212248f03118b" Sep 29 11:05:09 crc kubenswrapper[4752]: E0929 11:05:09.662912 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fb13b4b68f3d03678d834847478eaa955b615ed728c6bbe1e9212248f03118b\": container with ID starting with 1fb13b4b68f3d03678d834847478eaa955b615ed728c6bbe1e9212248f03118b not found: ID does not exist" containerID="1fb13b4b68f3d03678d834847478eaa955b615ed728c6bbe1e9212248f03118b" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.662929 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fb13b4b68f3d03678d834847478eaa955b615ed728c6bbe1e9212248f03118b"} err="failed to get container status \"1fb13b4b68f3d03678d834847478eaa955b615ed728c6bbe1e9212248f03118b\": rpc error: code = NotFound desc = could not find container \"1fb13b4b68f3d03678d834847478eaa955b615ed728c6bbe1e9212248f03118b\": container with ID starting with 1fb13b4b68f3d03678d834847478eaa955b615ed728c6bbe1e9212248f03118b not found: ID does not exist" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.662942 4752 scope.go:117] "RemoveContainer" containerID="4e8892a588bf4b63f0fddc92dd878a75d46621ef8097799d1d5607553b221fea" Sep 29 11:05:09 crc kubenswrapper[4752]: E0929 11:05:09.663236 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e8892a588bf4b63f0fddc92dd878a75d46621ef8097799d1d5607553b221fea\": container with ID starting with 4e8892a588bf4b63f0fddc92dd878a75d46621ef8097799d1d5607553b221fea not found: ID does not exist" containerID="4e8892a588bf4b63f0fddc92dd878a75d46621ef8097799d1d5607553b221fea" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.663255 4752 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4e8892a588bf4b63f0fddc92dd878a75d46621ef8097799d1d5607553b221fea"} err="failed to get container status \"4e8892a588bf4b63f0fddc92dd878a75d46621ef8097799d1d5607553b221fea\": rpc error: code = NotFound desc = could not find container \"4e8892a588bf4b63f0fddc92dd878a75d46621ef8097799d1d5607553b221fea\": container with ID starting with 4e8892a588bf4b63f0fddc92dd878a75d46621ef8097799d1d5607553b221fea not found: ID does not exist" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.674726 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92rpq\" (UniqueName: \"kubernetes.io/projected/fe6b2f0b-0559-4e95-ab61-b03e24044991-kube-api-access-92rpq\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.674792 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-scripts\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.674842 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.674875 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b2f0b-0559-4e95-ab61-b03e24044991-run-httpd\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " 
pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.674934 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.674957 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-config-data\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.674975 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.675014 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b2f0b-0559-4e95-ab61-b03e24044991-log-httpd\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.776246 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-scripts\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.776315 
4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.776354 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b2f0b-0559-4e95-ab61-b03e24044991-run-httpd\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.776400 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.776424 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-config-data\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.776445 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.776490 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/fe6b2f0b-0559-4e95-ab61-b03e24044991-log-httpd\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.776589 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92rpq\" (UniqueName: \"kubernetes.io/projected/fe6b2f0b-0559-4e95-ab61-b03e24044991-kube-api-access-92rpq\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.777134 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b2f0b-0559-4e95-ab61-b03e24044991-run-httpd\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.777431 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b2f0b-0559-4e95-ab61-b03e24044991-log-httpd\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.780765 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.781280 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " 
pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.781541 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-scripts\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.781901 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-config-data\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.782017 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.798116 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92rpq\" (UniqueName: \"kubernetes.io/projected/fe6b2f0b-0559-4e95-ab61-b03e24044991-kube-api-access-92rpq\") pod \"ceilometer-0\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:09 crc kubenswrapper[4752]: I0929 11:05:09.924188 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:10 crc kubenswrapper[4752]: I0929 11:05:10.043541 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55d465d9-e21f-439a-82b6-3479e802de2d" path="/var/lib/kubelet/pods/55d465d9-e21f-439a-82b6-3479e802de2d/volumes" Sep 29 11:05:10 crc kubenswrapper[4752]: I0929 11:05:10.428710 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:05:10 crc kubenswrapper[4752]: W0929 11:05:10.434924 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe6b2f0b_0559_4e95_ab61_b03e24044991.slice/crio-30fce793ecc1161c8dc28c641d946a5863a186b6ff0cfb3f74213f569406a315 WatchSource:0}: Error finding container 30fce793ecc1161c8dc28c641d946a5863a186b6ff0cfb3f74213f569406a315: Status 404 returned error can't find the container with id 30fce793ecc1161c8dc28c641d946a5863a186b6ff0cfb3f74213f569406a315 Sep 29 11:05:10 crc kubenswrapper[4752]: I0929 11:05:10.523011 4752 generic.go:334] "Generic (PLEG): container finished" podID="a3ca3474-aa46-4a60-8451-524755f59b62" containerID="1ebb764c140be5cb6d96d917c5c99c0adcfd7a731aea30b903e1c7551a5a8f0a" exitCode=0 Sep 29 11:05:10 crc kubenswrapper[4752]: I0929 11:05:10.523088 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-plzcj" event={"ID":"a3ca3474-aa46-4a60-8451-524755f59b62","Type":"ContainerDied","Data":"1ebb764c140be5cb6d96d917c5c99c0adcfd7a731aea30b903e1c7551a5a8f0a"} Sep 29 11:05:10 crc kubenswrapper[4752]: I0929 11:05:10.523916 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"fe6b2f0b-0559-4e95-ab61-b03e24044991","Type":"ContainerStarted","Data":"30fce793ecc1161c8dc28c641d946a5863a186b6ff0cfb3f74213f569406a315"} Sep 29 11:05:11 crc kubenswrapper[4752]: I0929 11:05:11.539450 4752 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"fe6b2f0b-0559-4e95-ab61-b03e24044991","Type":"ContainerStarted","Data":"acebffe58c91265d3760f6170c07ae08f0fd34f74f970b68852d0b0f3367cbd4"} Sep 29 11:05:11 crc kubenswrapper[4752]: I0929 11:05:11.935427 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-plzcj" Sep 29 11:05:12 crc kubenswrapper[4752]: I0929 11:05:12.015149 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qsn4\" (UniqueName: \"kubernetes.io/projected/a3ca3474-aa46-4a60-8451-524755f59b62-kube-api-access-4qsn4\") pod \"a3ca3474-aa46-4a60-8451-524755f59b62\" (UID: \"a3ca3474-aa46-4a60-8451-524755f59b62\") " Sep 29 11:05:12 crc kubenswrapper[4752]: I0929 11:05:12.022998 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3ca3474-aa46-4a60-8451-524755f59b62-kube-api-access-4qsn4" (OuterVolumeSpecName: "kube-api-access-4qsn4") pod "a3ca3474-aa46-4a60-8451-524755f59b62" (UID: "a3ca3474-aa46-4a60-8451-524755f59b62"). InnerVolumeSpecName "kube-api-access-4qsn4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:05:12 crc kubenswrapper[4752]: I0929 11:05:12.117387 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qsn4\" (UniqueName: \"kubernetes.io/projected/a3ca3474-aa46-4a60-8451-524755f59b62-kube-api-access-4qsn4\") on node \"crc\" DevicePath \"\"" Sep 29 11:05:12 crc kubenswrapper[4752]: I0929 11:05:12.566620 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"fe6b2f0b-0559-4e95-ab61-b03e24044991","Type":"ContainerStarted","Data":"e6557739b1ea7ba8a590a94c5e65eee60d678c7c1c0eeb5c69d0b79e0ee9d8d7"} Sep 29 11:05:12 crc kubenswrapper[4752]: I0929 11:05:12.569582 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-plzcj" event={"ID":"a3ca3474-aa46-4a60-8451-524755f59b62","Type":"ContainerDied","Data":"6b961233ef8631f0236a6f61583e69e7f65be62d10160ac574a00e6cfb7f558f"} Sep 29 11:05:12 crc kubenswrapper[4752]: I0929 11:05:12.569622 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b961233ef8631f0236a6f61583e69e7f65be62d10160ac574a00e6cfb7f558f" Sep 29 11:05:12 crc kubenswrapper[4752]: I0929 11:05:12.569674 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-plzcj" Sep 29 11:05:13 crc kubenswrapper[4752]: I0929 11:05:13.580548 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"fe6b2f0b-0559-4e95-ab61-b03e24044991","Type":"ContainerStarted","Data":"7c2f805d9b21dcf0c65c1f6d70163ea25ae0000cea5432314846db7fa77df469"} Sep 29 11:05:14 crc kubenswrapper[4752]: I0929 11:05:14.591337 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"fe6b2f0b-0559-4e95-ab61-b03e24044991","Type":"ContainerStarted","Data":"971cc5e7f7e7c493f04b0cdac728a4107a5f081d2435406cd9c73a3b388a07e7"} Sep 29 11:05:14 crc kubenswrapper[4752]: I0929 11:05:14.593430 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:14 crc kubenswrapper[4752]: I0929 11:05:14.621301 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.8865498299999999 podStartE2EDuration="5.6212766s" podCreationTimestamp="2025-09-29 11:05:09 +0000 UTC" firstStartedPulling="2025-09-29 11:05:10.437199553 +0000 UTC m=+1251.226341220" lastFinishedPulling="2025-09-29 11:05:14.171926323 +0000 UTC m=+1254.961067990" observedRunningTime="2025-09-29 11:05:14.612277588 +0000 UTC m=+1255.401419275" watchObservedRunningTime="2025-09-29 11:05:14.6212766 +0000 UTC m=+1255.410418267" Sep 29 11:05:15 crc kubenswrapper[4752]: I0929 11:05:15.869047 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/kube-state-metrics-0" Sep 29 11:05:18 crc kubenswrapper[4752]: I0929 11:05:18.550525 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-147d-account-create-79sqq"] Sep 29 11:05:18 crc kubenswrapper[4752]: E0929 11:05:18.551099 4752 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a3ca3474-aa46-4a60-8451-524755f59b62" containerName="mariadb-database-create" Sep 29 11:05:18 crc kubenswrapper[4752]: I0929 11:05:18.551111 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3ca3474-aa46-4a60-8451-524755f59b62" containerName="mariadb-database-create" Sep 29 11:05:18 crc kubenswrapper[4752]: I0929 11:05:18.551302 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3ca3474-aa46-4a60-8451-524755f59b62" containerName="mariadb-database-create" Sep 29 11:05:18 crc kubenswrapper[4752]: I0929 11:05:18.551841 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-147d-account-create-79sqq" Sep 29 11:05:18 crc kubenswrapper[4752]: I0929 11:05:18.560638 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Sep 29 11:05:18 crc kubenswrapper[4752]: I0929 11:05:18.565707 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-147d-account-create-79sqq"] Sep 29 11:05:18 crc kubenswrapper[4752]: I0929 11:05:18.662919 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8wwv\" (UniqueName: \"kubernetes.io/projected/739beb56-a180-4f55-b671-52a96d798d72-kube-api-access-r8wwv\") pod \"watcher-147d-account-create-79sqq\" (UID: \"739beb56-a180-4f55-b671-52a96d798d72\") " pod="watcher-kuttl-default/watcher-147d-account-create-79sqq" Sep 29 11:05:18 crc kubenswrapper[4752]: I0929 11:05:18.764712 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8wwv\" (UniqueName: \"kubernetes.io/projected/739beb56-a180-4f55-b671-52a96d798d72-kube-api-access-r8wwv\") pod \"watcher-147d-account-create-79sqq\" (UID: \"739beb56-a180-4f55-b671-52a96d798d72\") " pod="watcher-kuttl-default/watcher-147d-account-create-79sqq" Sep 29 11:05:18 crc kubenswrapper[4752]: I0929 11:05:18.796298 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8wwv\" (UniqueName: \"kubernetes.io/projected/739beb56-a180-4f55-b671-52a96d798d72-kube-api-access-r8wwv\") pod \"watcher-147d-account-create-79sqq\" (UID: \"739beb56-a180-4f55-b671-52a96d798d72\") " pod="watcher-kuttl-default/watcher-147d-account-create-79sqq" Sep 29 11:05:18 crc kubenswrapper[4752]: I0929 11:05:18.872900 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-147d-account-create-79sqq" Sep 29 11:05:19 crc kubenswrapper[4752]: I0929 11:05:19.319402 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-147d-account-create-79sqq"] Sep 29 11:05:19 crc kubenswrapper[4752]: W0929 11:05:19.325417 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod739beb56_a180_4f55_b671_52a96d798d72.slice/crio-4dbe00c874498c880c7fc87dd85871aba5dd8712e1aa7c0e0e8cb9d87a4c600a WatchSource:0}: Error finding container 4dbe00c874498c880c7fc87dd85871aba5dd8712e1aa7c0e0e8cb9d87a4c600a: Status 404 returned error can't find the container with id 4dbe00c874498c880c7fc87dd85871aba5dd8712e1aa7c0e0e8cb9d87a4c600a Sep 29 11:05:19 crc kubenswrapper[4752]: I0929 11:05:19.654735 4752 generic.go:334] "Generic (PLEG): container finished" podID="739beb56-a180-4f55-b671-52a96d798d72" containerID="97b115dcd0fec96d222056a04cfbc24da589dee1b137fb838ca8229cb9a12518" exitCode=0 Sep 29 11:05:19 crc kubenswrapper[4752]: I0929 11:05:19.654843 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-147d-account-create-79sqq" event={"ID":"739beb56-a180-4f55-b671-52a96d798d72","Type":"ContainerDied","Data":"97b115dcd0fec96d222056a04cfbc24da589dee1b137fb838ca8229cb9a12518"} Sep 29 11:05:19 crc kubenswrapper[4752]: I0929 11:05:19.655271 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/watcher-147d-account-create-79sqq" event={"ID":"739beb56-a180-4f55-b671-52a96d798d72","Type":"ContainerStarted","Data":"4dbe00c874498c880c7fc87dd85871aba5dd8712e1aa7c0e0e8cb9d87a4c600a"} Sep 29 11:05:21 crc kubenswrapper[4752]: I0929 11:05:21.044363 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-147d-account-create-79sqq" Sep 29 11:05:21 crc kubenswrapper[4752]: I0929 11:05:21.204142 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8wwv\" (UniqueName: \"kubernetes.io/projected/739beb56-a180-4f55-b671-52a96d798d72-kube-api-access-r8wwv\") pod \"739beb56-a180-4f55-b671-52a96d798d72\" (UID: \"739beb56-a180-4f55-b671-52a96d798d72\") " Sep 29 11:05:21 crc kubenswrapper[4752]: I0929 11:05:21.215210 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/739beb56-a180-4f55-b671-52a96d798d72-kube-api-access-r8wwv" (OuterVolumeSpecName: "kube-api-access-r8wwv") pod "739beb56-a180-4f55-b671-52a96d798d72" (UID: "739beb56-a180-4f55-b671-52a96d798d72"). InnerVolumeSpecName "kube-api-access-r8wwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:05:21 crc kubenswrapper[4752]: I0929 11:05:21.306585 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8wwv\" (UniqueName: \"kubernetes.io/projected/739beb56-a180-4f55-b671-52a96d798d72-kube-api-access-r8wwv\") on node \"crc\" DevicePath \"\"" Sep 29 11:05:21 crc kubenswrapper[4752]: I0929 11:05:21.679541 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-147d-account-create-79sqq" event={"ID":"739beb56-a180-4f55-b671-52a96d798d72","Type":"ContainerDied","Data":"4dbe00c874498c880c7fc87dd85871aba5dd8712e1aa7c0e0e8cb9d87a4c600a"} Sep 29 11:05:21 crc kubenswrapper[4752]: I0929 11:05:21.679607 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dbe00c874498c880c7fc87dd85871aba5dd8712e1aa7c0e0e8cb9d87a4c600a" Sep 29 11:05:21 crc kubenswrapper[4752]: I0929 11:05:21.679637 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-147d-account-create-79sqq" Sep 29 11:05:23 crc kubenswrapper[4752]: I0929 11:05:23.884026 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-h665g"] Sep 29 11:05:23 crc kubenswrapper[4752]: E0929 11:05:23.885197 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="739beb56-a180-4f55-b671-52a96d798d72" containerName="mariadb-account-create" Sep 29 11:05:23 crc kubenswrapper[4752]: I0929 11:05:23.885219 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="739beb56-a180-4f55-b671-52a96d798d72" containerName="mariadb-account-create" Sep 29 11:05:23 crc kubenswrapper[4752]: I0929 11:05:23.885489 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="739beb56-a180-4f55-b671-52a96d798d72" containerName="mariadb-account-create" Sep 29 11:05:23 crc kubenswrapper[4752]: I0929 11:05:23.886436 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" Sep 29 11:05:23 crc kubenswrapper[4752]: I0929 11:05:23.888397 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Sep 29 11:05:23 crc kubenswrapper[4752]: I0929 11:05:23.888846 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-88q2l" Sep 29 11:05:23 crc kubenswrapper[4752]: I0929 11:05:23.894337 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-h665g"] Sep 29 11:05:23 crc kubenswrapper[4752]: I0929 11:05:23.952983 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b3cc2bbe-57f1-4c2f-802c-e7d954268185-db-sync-config-data\") pod \"watcher-kuttl-db-sync-h665g\" (UID: \"b3cc2bbe-57f1-4c2f-802c-e7d954268185\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" Sep 29 11:05:23 crc kubenswrapper[4752]: I0929 11:05:23.953028 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hstm\" (UniqueName: \"kubernetes.io/projected/b3cc2bbe-57f1-4c2f-802c-e7d954268185-kube-api-access-4hstm\") pod \"watcher-kuttl-db-sync-h665g\" (UID: \"b3cc2bbe-57f1-4c2f-802c-e7d954268185\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" Sep 29 11:05:23 crc kubenswrapper[4752]: I0929 11:05:23.953094 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3cc2bbe-57f1-4c2f-802c-e7d954268185-config-data\") pod \"watcher-kuttl-db-sync-h665g\" (UID: \"b3cc2bbe-57f1-4c2f-802c-e7d954268185\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" Sep 29 11:05:23 crc kubenswrapper[4752]: I0929 11:05:23.953134 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3cc2bbe-57f1-4c2f-802c-e7d954268185-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-h665g\" (UID: \"b3cc2bbe-57f1-4c2f-802c-e7d954268185\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" Sep 29 11:05:24 crc kubenswrapper[4752]: I0929 11:05:24.054096 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3cc2bbe-57f1-4c2f-802c-e7d954268185-config-data\") pod \"watcher-kuttl-db-sync-h665g\" (UID: \"b3cc2bbe-57f1-4c2f-802c-e7d954268185\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" Sep 29 11:05:24 crc kubenswrapper[4752]: I0929 11:05:24.054160 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3cc2bbe-57f1-4c2f-802c-e7d954268185-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-h665g\" (UID: \"b3cc2bbe-57f1-4c2f-802c-e7d954268185\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" Sep 29 11:05:24 crc kubenswrapper[4752]: I0929 11:05:24.054205 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b3cc2bbe-57f1-4c2f-802c-e7d954268185-db-sync-config-data\") pod \"watcher-kuttl-db-sync-h665g\" (UID: \"b3cc2bbe-57f1-4c2f-802c-e7d954268185\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" Sep 29 11:05:24 crc kubenswrapper[4752]: I0929 11:05:24.054228 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hstm\" (UniqueName: \"kubernetes.io/projected/b3cc2bbe-57f1-4c2f-802c-e7d954268185-kube-api-access-4hstm\") pod \"watcher-kuttl-db-sync-h665g\" (UID: \"b3cc2bbe-57f1-4c2f-802c-e7d954268185\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" Sep 29 11:05:24 crc kubenswrapper[4752]: I0929 
11:05:24.059940 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b3cc2bbe-57f1-4c2f-802c-e7d954268185-db-sync-config-data\") pod \"watcher-kuttl-db-sync-h665g\" (UID: \"b3cc2bbe-57f1-4c2f-802c-e7d954268185\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" Sep 29 11:05:24 crc kubenswrapper[4752]: I0929 11:05:24.060173 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3cc2bbe-57f1-4c2f-802c-e7d954268185-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-h665g\" (UID: \"b3cc2bbe-57f1-4c2f-802c-e7d954268185\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" Sep 29 11:05:24 crc kubenswrapper[4752]: I0929 11:05:24.070609 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3cc2bbe-57f1-4c2f-802c-e7d954268185-config-data\") pod \"watcher-kuttl-db-sync-h665g\" (UID: \"b3cc2bbe-57f1-4c2f-802c-e7d954268185\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" Sep 29 11:05:24 crc kubenswrapper[4752]: I0929 11:05:24.090857 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hstm\" (UniqueName: \"kubernetes.io/projected/b3cc2bbe-57f1-4c2f-802c-e7d954268185-kube-api-access-4hstm\") pod \"watcher-kuttl-db-sync-h665g\" (UID: \"b3cc2bbe-57f1-4c2f-802c-e7d954268185\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" Sep 29 11:05:24 crc kubenswrapper[4752]: I0929 11:05:24.211361 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" Sep 29 11:05:24 crc kubenswrapper[4752]: I0929 11:05:24.689360 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-h665g"] Sep 29 11:05:24 crc kubenswrapper[4752]: W0929 11:05:24.689889 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3cc2bbe_57f1_4c2f_802c_e7d954268185.slice/crio-0dbbbd8f08c2e9a7c12aa46caeca587e45c44ce437e35768195d512b2fba4721 WatchSource:0}: Error finding container 0dbbbd8f08c2e9a7c12aa46caeca587e45c44ce437e35768195d512b2fba4721: Status 404 returned error can't find the container with id 0dbbbd8f08c2e9a7c12aa46caeca587e45c44ce437e35768195d512b2fba4721 Sep 29 11:05:24 crc kubenswrapper[4752]: I0929 11:05:24.719908 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" event={"ID":"b3cc2bbe-57f1-4c2f-802c-e7d954268185","Type":"ContainerStarted","Data":"0dbbbd8f08c2e9a7c12aa46caeca587e45c44ce437e35768195d512b2fba4721"} Sep 29 11:05:26 crc kubenswrapper[4752]: I0929 11:05:26.175235 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 11:05:26 crc kubenswrapper[4752]: I0929 11:05:26.175621 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:05:38 crc kubenswrapper[4752]: E0929 11:05:38.793987 4752 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="38.102.83.66:5001/podified-master-centos10/openstack-watcher-api:watcher_latest" Sep 29 11:05:38 crc kubenswrapper[4752]: E0929 11:05:38.794678 4752 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.66:5001/podified-master-centos10/openstack-watcher-api:watcher_latest" Sep 29 11:05:38 crc kubenswrapper[4752]: E0929 11:05:38.794838 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:watcher-kuttl-db-sync,Image:38.102.83.66:5001/podified-master-centos10/openstack-watcher-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/watcher/watcher.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:watcher-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4hstm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/se
rviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-kuttl-db-sync-h665g_watcher-kuttl-default(b3cc2bbe-57f1-4c2f-802c-e7d954268185): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 11:05:38 crc kubenswrapper[4752]: E0929 11:05:38.796160 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-kuttl-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" podUID="b3cc2bbe-57f1-4c2f-802c-e7d954268185" Sep 29 11:05:38 crc kubenswrapper[4752]: E0929 11:05:38.881919 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-kuttl-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.66:5001/podified-master-centos10/openstack-watcher-api:watcher_latest\\\"\"" pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" podUID="b3cc2bbe-57f1-4c2f-802c-e7d954268185" Sep 29 11:05:39 crc kubenswrapper[4752]: I0929 11:05:39.934714 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:05:52 crc kubenswrapper[4752]: I0929 11:05:52.033672 4752 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Sep 29 11:05:53 crc kubenswrapper[4752]: I0929 11:05:53.006436 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" event={"ID":"b3cc2bbe-57f1-4c2f-802c-e7d954268185","Type":"ContainerStarted","Data":"687ff988f13298c88fb93ee6de176587291f824339939fe4cfec4216479ceab7"} Sep 29 11:05:53 crc kubenswrapper[4752]: I0929 11:05:53.033476 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" podStartSLOduration=2.600267499 podStartE2EDuration="30.033457336s" podCreationTimestamp="2025-09-29 11:05:23 +0000 UTC" firstStartedPulling="2025-09-29 11:05:24.69167854 +0000 UTC m=+1265.480820207" lastFinishedPulling="2025-09-29 11:05:52.124868367 +0000 UTC m=+1292.914010044" observedRunningTime="2025-09-29 11:05:53.025291145 +0000 UTC m=+1293.814432852" watchObservedRunningTime="2025-09-29 11:05:53.033457336 +0000 UTC m=+1293.822599003" Sep 29 11:05:56 crc kubenswrapper[4752]: I0929 11:05:56.032467 4752 generic.go:334] "Generic (PLEG): container finished" podID="b3cc2bbe-57f1-4c2f-802c-e7d954268185" containerID="687ff988f13298c88fb93ee6de176587291f824339939fe4cfec4216479ceab7" exitCode=0 Sep 29 11:05:56 crc kubenswrapper[4752]: I0929 11:05:56.048499 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" event={"ID":"b3cc2bbe-57f1-4c2f-802c-e7d954268185","Type":"ContainerDied","Data":"687ff988f13298c88fb93ee6de176587291f824339939fe4cfec4216479ceab7"} Sep 29 11:05:56 crc kubenswrapper[4752]: I0929 11:05:56.175508 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 11:05:56 crc kubenswrapper[4752]: 
I0929 11:05:56.175602 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:05:56 crc kubenswrapper[4752]: I0929 11:05:56.175750 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 11:05:56 crc kubenswrapper[4752]: I0929 11:05:56.176984 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"163ff8dbb1a373e991e8699e30ebf0d1354dad4f96196cd59c49a7d6edcb147e"} pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 11:05:56 crc kubenswrapper[4752]: I0929 11:05:56.177115 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" containerID="cri-o://163ff8dbb1a373e991e8699e30ebf0d1354dad4f96196cd59c49a7d6edcb147e" gracePeriod=600 Sep 29 11:05:57 crc kubenswrapper[4752]: I0929 11:05:57.046194 4752 generic.go:334] "Generic (PLEG): container finished" podID="5863c243-797d-462a-b11f-71aaf005f8d1" containerID="163ff8dbb1a373e991e8699e30ebf0d1354dad4f96196cd59c49a7d6edcb147e" exitCode=0 Sep 29 11:05:57 crc kubenswrapper[4752]: I0929 11:05:57.046262 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerDied","Data":"163ff8dbb1a373e991e8699e30ebf0d1354dad4f96196cd59c49a7d6edcb147e"} Sep 29 11:05:57 crc 
kubenswrapper[4752]: I0929 11:05:57.046602 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerStarted","Data":"18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670"} Sep 29 11:05:57 crc kubenswrapper[4752]: I0929 11:05:57.046662 4752 scope.go:117] "RemoveContainer" containerID="48a0da04429cf7fcc316318f0d1c0bddde646fbce423db761e54fa0241cf9fda" Sep 29 11:05:57 crc kubenswrapper[4752]: I0929 11:05:57.421345 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" Sep 29 11:05:57 crc kubenswrapper[4752]: I0929 11:05:57.517057 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3cc2bbe-57f1-4c2f-802c-e7d954268185-combined-ca-bundle\") pod \"b3cc2bbe-57f1-4c2f-802c-e7d954268185\" (UID: \"b3cc2bbe-57f1-4c2f-802c-e7d954268185\") " Sep 29 11:05:57 crc kubenswrapper[4752]: I0929 11:05:57.517113 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b3cc2bbe-57f1-4c2f-802c-e7d954268185-db-sync-config-data\") pod \"b3cc2bbe-57f1-4c2f-802c-e7d954268185\" (UID: \"b3cc2bbe-57f1-4c2f-802c-e7d954268185\") " Sep 29 11:05:57 crc kubenswrapper[4752]: I0929 11:05:57.517206 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3cc2bbe-57f1-4c2f-802c-e7d954268185-config-data\") pod \"b3cc2bbe-57f1-4c2f-802c-e7d954268185\" (UID: \"b3cc2bbe-57f1-4c2f-802c-e7d954268185\") " Sep 29 11:05:57 crc kubenswrapper[4752]: I0929 11:05:57.517267 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hstm\" (UniqueName: 
\"kubernetes.io/projected/b3cc2bbe-57f1-4c2f-802c-e7d954268185-kube-api-access-4hstm\") pod \"b3cc2bbe-57f1-4c2f-802c-e7d954268185\" (UID: \"b3cc2bbe-57f1-4c2f-802c-e7d954268185\") " Sep 29 11:05:57 crc kubenswrapper[4752]: I0929 11:05:57.522636 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3cc2bbe-57f1-4c2f-802c-e7d954268185-kube-api-access-4hstm" (OuterVolumeSpecName: "kube-api-access-4hstm") pod "b3cc2bbe-57f1-4c2f-802c-e7d954268185" (UID: "b3cc2bbe-57f1-4c2f-802c-e7d954268185"). InnerVolumeSpecName "kube-api-access-4hstm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:05:57 crc kubenswrapper[4752]: I0929 11:05:57.538202 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3cc2bbe-57f1-4c2f-802c-e7d954268185-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b3cc2bbe-57f1-4c2f-802c-e7d954268185" (UID: "b3cc2bbe-57f1-4c2f-802c-e7d954268185"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:05:57 crc kubenswrapper[4752]: I0929 11:05:57.540341 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3cc2bbe-57f1-4c2f-802c-e7d954268185-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3cc2bbe-57f1-4c2f-802c-e7d954268185" (UID: "b3cc2bbe-57f1-4c2f-802c-e7d954268185"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:05:57 crc kubenswrapper[4752]: I0929 11:05:57.573507 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3cc2bbe-57f1-4c2f-802c-e7d954268185-config-data" (OuterVolumeSpecName: "config-data") pod "b3cc2bbe-57f1-4c2f-802c-e7d954268185" (UID: "b3cc2bbe-57f1-4c2f-802c-e7d954268185"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:05:57 crc kubenswrapper[4752]: I0929 11:05:57.619755 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3cc2bbe-57f1-4c2f-802c-e7d954268185-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:05:57 crc kubenswrapper[4752]: I0929 11:05:57.619820 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hstm\" (UniqueName: \"kubernetes.io/projected/b3cc2bbe-57f1-4c2f-802c-e7d954268185-kube-api-access-4hstm\") on node \"crc\" DevicePath \"\"" Sep 29 11:05:57 crc kubenswrapper[4752]: I0929 11:05:57.619832 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3cc2bbe-57f1-4c2f-802c-e7d954268185-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:05:57 crc kubenswrapper[4752]: I0929 11:05:57.619842 4752 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b3cc2bbe-57f1-4c2f-802c-e7d954268185-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.053792 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" event={"ID":"b3cc2bbe-57f1-4c2f-802c-e7d954268185","Type":"ContainerDied","Data":"0dbbbd8f08c2e9a7c12aa46caeca587e45c44ce437e35768195d512b2fba4721"} Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.054099 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dbbbd8f08c2e9a7c12aa46caeca587e45c44ce437e35768195d512b2fba4721" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.054148 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-h665g" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.387219 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:05:58 crc kubenswrapper[4752]: E0929 11:05:58.387569 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3cc2bbe-57f1-4c2f-802c-e7d954268185" containerName="watcher-kuttl-db-sync" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.387580 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3cc2bbe-57f1-4c2f-802c-e7d954268185" containerName="watcher-kuttl-db-sync" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.387780 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3cc2bbe-57f1-4c2f-802c-e7d954268185" containerName="watcher-kuttl-db-sync" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.388696 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.395934 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-88q2l" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.396247 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.417511 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.433854 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\") " 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.433925 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.433999 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.434090 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zrl5\" (UniqueName: \"kubernetes.io/projected/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-kube-api-access-8zrl5\") pod \"watcher-kuttl-api-0\" (UID: \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.434134 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-logs\") pod \"watcher-kuttl-api-0\" (UID: \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.434750 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.436021 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.440183 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.462854 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.520699 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.522310 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.526469 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.529652 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.535520 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.535693 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vflf6\" (UniqueName: \"kubernetes.io/projected/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-kube-api-access-vflf6\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.535779 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.535901 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.536038 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zrl5\" (UniqueName: \"kubernetes.io/projected/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-kube-api-access-8zrl5\") pod \"watcher-kuttl-api-0\" (UID: \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.536119 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.536210 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-config-data\") pod 
\"watcher-kuttl-decision-engine-0\" (UID: \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.536339 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-logs\") pod \"watcher-kuttl-api-0\" (UID: \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.536482 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.537508 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.537049 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-logs\") pod \"watcher-kuttl-api-0\" (UID: \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.556598 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\") " 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.557090 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.557651 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.572324 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zrl5\" (UniqueName: \"kubernetes.io/projected/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-kube-api-access-8zrl5\") pod \"watcher-kuttl-api-0\" (UID: \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.639205 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.639257 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 
11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.639292 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vflf6\" (UniqueName: \"kubernetes.io/projected/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-kube-api-access-vflf6\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.639340 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.639739 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.639855 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.639915 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cgdh\" (UniqueName: \"kubernetes.io/projected/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-kube-api-access-4cgdh\") pod \"watcher-kuttl-applier-0\" (UID: 
\"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.639994 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.640027 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.640414 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.643368 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.644870 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.645773 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.657001 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vflf6\" (UniqueName: \"kubernetes.io/projected/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-kube-api-access-vflf6\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.721964 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.741669 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.741731 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cgdh\" (UniqueName: \"kubernetes.io/projected/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-kube-api-access-4cgdh\") pod \"watcher-kuttl-applier-0\" (UID: \"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.741889 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.741917 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.742431 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.745882 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.746100 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.756451 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.759608 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cgdh\" (UniqueName: \"kubernetes.io/projected/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-kube-api-access-4cgdh\") pod \"watcher-kuttl-applier-0\" (UID: \"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:05:58 crc kubenswrapper[4752]: I0929 11:05:58.857408 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:05:59 crc kubenswrapper[4752]: I0929 11:05:59.184542 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:05:59 crc kubenswrapper[4752]: W0929 11:05:59.186025 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2a1fefe_b6e1_4a89_b8ee_4cb201d32c2c.slice/crio-d4d3cc325241ca5109885ac08c038d41e30d8a2bab6356e8645b3052917b673c WatchSource:0}: Error finding container d4d3cc325241ca5109885ac08c038d41e30d8a2bab6356e8645b3052917b673c: Status 404 returned error can't find the container with id d4d3cc325241ca5109885ac08c038d41e30d8a2bab6356e8645b3052917b673c Sep 29 11:05:59 crc kubenswrapper[4752]: I0929 11:05:59.254134 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:05:59 crc kubenswrapper[4752]: W0929 11:05:59.266482 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c3e5b57_3df6_4005_aa53_7cc77c1101ad.slice/crio-e0e636169f38ec8a4740544b3da90e992ad2c24fc876db6832b759f270b35fc8 WatchSource:0}: Error finding container e0e636169f38ec8a4740544b3da90e992ad2c24fc876db6832b759f270b35fc8: Status 404 returned 
error can't find the container with id e0e636169f38ec8a4740544b3da90e992ad2c24fc876db6832b759f270b35fc8 Sep 29 11:05:59 crc kubenswrapper[4752]: I0929 11:05:59.356379 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:05:59 crc kubenswrapper[4752]: W0929 11:05:59.368766 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c7abae5_8a39_4e88_ac5c_997fb44a9fcc.slice/crio-89150f4df4eeacd20a1b308954aa66c55c99f862151749c9c5ae791f08087753 WatchSource:0}: Error finding container 89150f4df4eeacd20a1b308954aa66c55c99f862151749c9c5ae791f08087753: Status 404 returned error can't find the container with id 89150f4df4eeacd20a1b308954aa66c55c99f862151749c9c5ae791f08087753 Sep 29 11:06:00 crc kubenswrapper[4752]: I0929 11:06:00.093903 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c","Type":"ContainerStarted","Data":"b15b69efa07590dd7ca87daeab60e9306e135919b4b681122ee3513e869ee432"} Sep 29 11:06:00 crc kubenswrapper[4752]: I0929 11:06:00.094327 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:06:00 crc kubenswrapper[4752]: I0929 11:06:00.094343 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c","Type":"ContainerStarted","Data":"34c7061bdfaa8c6f62a36fdf2e10a14fd0db032c6c3de657e5d6e57f95f942f1"} Sep 29 11:06:00 crc kubenswrapper[4752]: I0929 11:06:00.094355 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c","Type":"ContainerStarted","Data":"d4d3cc325241ca5109885ac08c038d41e30d8a2bab6356e8645b3052917b673c"} Sep 29 11:06:00 crc kubenswrapper[4752]: I0929 
11:06:00.096977 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"2c3e5b57-3df6-4005-aa53-7cc77c1101ad","Type":"ContainerStarted","Data":"e0e636169f38ec8a4740544b3da90e992ad2c24fc876db6832b759f270b35fc8"} Sep 29 11:06:00 crc kubenswrapper[4752]: I0929 11:06:00.103997 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc","Type":"ContainerStarted","Data":"89150f4df4eeacd20a1b308954aa66c55c99f862151749c9c5ae791f08087753"} Sep 29 11:06:00 crc kubenswrapper[4752]: I0929 11:06:00.137422 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.137400098 podStartE2EDuration="2.137400098s" podCreationTimestamp="2025-09-29 11:05:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:06:00.136437323 +0000 UTC m=+1300.925579000" watchObservedRunningTime="2025-09-29 11:06:00.137400098 +0000 UTC m=+1300.926541775" Sep 29 11:06:01 crc kubenswrapper[4752]: I0929 11:06:01.117441 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"2c3e5b57-3df6-4005-aa53-7cc77c1101ad","Type":"ContainerStarted","Data":"338689d2438b2a2babeb9971e3a14d311a846b7870e414d5b8897b75c7dc86d8"} Sep 29 11:06:01 crc kubenswrapper[4752]: I0929 11:06:01.123545 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc","Type":"ContainerStarted","Data":"8f129063a19f5270a664c01d8b862f373a2001f1ae0937949cf1cf4af5d08e13"} Sep 29 11:06:01 crc kubenswrapper[4752]: I0929 11:06:01.144768 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=1.924231085 podStartE2EDuration="3.144754741s" podCreationTimestamp="2025-09-29 11:05:58 +0000 UTC" firstStartedPulling="2025-09-29 11:05:59.269187719 +0000 UTC m=+1300.058329386" lastFinishedPulling="2025-09-29 11:06:00.489711375 +0000 UTC m=+1301.278853042" observedRunningTime="2025-09-29 11:06:01.142202445 +0000 UTC m=+1301.931344112" watchObservedRunningTime="2025-09-29 11:06:01.144754741 +0000 UTC m=+1301.933896408" Sep 29 11:06:01 crc kubenswrapper[4752]: I0929 11:06:01.166629 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.046404913 podStartE2EDuration="3.166603363s" podCreationTimestamp="2025-09-29 11:05:58 +0000 UTC" firstStartedPulling="2025-09-29 11:05:59.371386393 +0000 UTC m=+1300.160528060" lastFinishedPulling="2025-09-29 11:06:00.491584843 +0000 UTC m=+1301.280726510" observedRunningTime="2025-09-29 11:06:01.160272091 +0000 UTC m=+1301.949413758" watchObservedRunningTime="2025-09-29 11:06:01.166603363 +0000 UTC m=+1301.955745040" Sep 29 11:06:02 crc kubenswrapper[4752]: I0929 11:06:02.560374 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:06:03 crc kubenswrapper[4752]: I0929 11:06:03.722361 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:06:03 crc kubenswrapper[4752]: I0929 11:06:03.858579 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:06:08 crc kubenswrapper[4752]: I0929 11:06:08.722605 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:06:08 crc kubenswrapper[4752]: I0929 11:06:08.729653 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:06:08 crc kubenswrapper[4752]: I0929 11:06:08.757694 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:06:08 crc kubenswrapper[4752]: I0929 11:06:08.805651 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:06:08 crc kubenswrapper[4752]: I0929 11:06:08.857985 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:06:08 crc kubenswrapper[4752]: I0929 11:06:08.890520 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:06:09 crc kubenswrapper[4752]: I0929 11:06:09.198010 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:06:09 crc kubenswrapper[4752]: I0929 11:06:09.204207 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:06:09 crc kubenswrapper[4752]: I0929 11:06:09.223962 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:06:09 crc kubenswrapper[4752]: I0929 11:06:09.230689 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:06:11 crc kubenswrapper[4752]: I0929 11:06:11.214433 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:06:11 crc kubenswrapper[4752]: I0929 11:06:11.215145 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="fe6b2f0b-0559-4e95-ab61-b03e24044991" 
containerName="ceilometer-central-agent" containerID="cri-o://acebffe58c91265d3760f6170c07ae08f0fd34f74f970b68852d0b0f3367cbd4" gracePeriod=30 Sep 29 11:06:11 crc kubenswrapper[4752]: I0929 11:06:11.215296 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="fe6b2f0b-0559-4e95-ab61-b03e24044991" containerName="proxy-httpd" containerID="cri-o://971cc5e7f7e7c493f04b0cdac728a4107a5f081d2435406cd9c73a3b388a07e7" gracePeriod=30 Sep 29 11:06:11 crc kubenswrapper[4752]: I0929 11:06:11.220168 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="fe6b2f0b-0559-4e95-ab61-b03e24044991" containerName="ceilometer-notification-agent" containerID="cri-o://e6557739b1ea7ba8a590a94c5e65eee60d678c7c1c0eeb5c69d0b79e0ee9d8d7" gracePeriod=30 Sep 29 11:06:11 crc kubenswrapper[4752]: I0929 11:06:11.221561 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="fe6b2f0b-0559-4e95-ab61-b03e24044991" containerName="sg-core" containerID="cri-o://7c2f805d9b21dcf0c65c1f6d70163ea25ae0000cea5432314846db7fa77df469" gracePeriod=30 Sep 29 11:06:12 crc kubenswrapper[4752]: I0929 11:06:12.244266 4752 generic.go:334] "Generic (PLEG): container finished" podID="fe6b2f0b-0559-4e95-ab61-b03e24044991" containerID="971cc5e7f7e7c493f04b0cdac728a4107a5f081d2435406cd9c73a3b388a07e7" exitCode=0 Sep 29 11:06:12 crc kubenswrapper[4752]: I0929 11:06:12.244302 4752 generic.go:334] "Generic (PLEG): container finished" podID="fe6b2f0b-0559-4e95-ab61-b03e24044991" containerID="7c2f805d9b21dcf0c65c1f6d70163ea25ae0000cea5432314846db7fa77df469" exitCode=2 Sep 29 11:06:12 crc kubenswrapper[4752]: I0929 11:06:12.244312 4752 generic.go:334] "Generic (PLEG): container finished" podID="fe6b2f0b-0559-4e95-ab61-b03e24044991" containerID="acebffe58c91265d3760f6170c07ae08f0fd34f74f970b68852d0b0f3367cbd4" exitCode=0 Sep 29 
11:06:12 crc kubenswrapper[4752]: I0929 11:06:12.244354 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"fe6b2f0b-0559-4e95-ab61-b03e24044991","Type":"ContainerDied","Data":"971cc5e7f7e7c493f04b0cdac728a4107a5f081d2435406cd9c73a3b388a07e7"} Sep 29 11:06:12 crc kubenswrapper[4752]: I0929 11:06:12.244410 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"fe6b2f0b-0559-4e95-ab61-b03e24044991","Type":"ContainerDied","Data":"7c2f805d9b21dcf0c65c1f6d70163ea25ae0000cea5432314846db7fa77df469"} Sep 29 11:06:12 crc kubenswrapper[4752]: I0929 11:06:12.244426 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"fe6b2f0b-0559-4e95-ab61-b03e24044991","Type":"ContainerDied","Data":"acebffe58c91265d3760f6170c07ae08f0fd34f74f970b68852d0b0f3367cbd4"} Sep 29 11:06:14 crc kubenswrapper[4752]: I0929 11:06:14.998468 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.134482 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-scripts\") pod \"fe6b2f0b-0559-4e95-ab61-b03e24044991\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.134598 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b2f0b-0559-4e95-ab61-b03e24044991-run-httpd\") pod \"fe6b2f0b-0559-4e95-ab61-b03e24044991\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.134667 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-ceilometer-tls-certs\") pod \"fe6b2f0b-0559-4e95-ab61-b03e24044991\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.134717 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b2f0b-0559-4e95-ab61-b03e24044991-log-httpd\") pod \"fe6b2f0b-0559-4e95-ab61-b03e24044991\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.134786 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-combined-ca-bundle\") pod \"fe6b2f0b-0559-4e95-ab61-b03e24044991\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.134838 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-config-data\") pod \"fe6b2f0b-0559-4e95-ab61-b03e24044991\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.134905 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-sg-core-conf-yaml\") pod \"fe6b2f0b-0559-4e95-ab61-b03e24044991\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.134985 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92rpq\" (UniqueName: \"kubernetes.io/projected/fe6b2f0b-0559-4e95-ab61-b03e24044991-kube-api-access-92rpq\") pod \"fe6b2f0b-0559-4e95-ab61-b03e24044991\" (UID: \"fe6b2f0b-0559-4e95-ab61-b03e24044991\") " Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.135397 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe6b2f0b-0559-4e95-ab61-b03e24044991-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fe6b2f0b-0559-4e95-ab61-b03e24044991" (UID: "fe6b2f0b-0559-4e95-ab61-b03e24044991"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.135595 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b2f0b-0559-4e95-ab61-b03e24044991-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.136743 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe6b2f0b-0559-4e95-ab61-b03e24044991-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fe6b2f0b-0559-4e95-ab61-b03e24044991" (UID: "fe6b2f0b-0559-4e95-ab61-b03e24044991"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.142114 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-scripts" (OuterVolumeSpecName: "scripts") pod "fe6b2f0b-0559-4e95-ab61-b03e24044991" (UID: "fe6b2f0b-0559-4e95-ab61-b03e24044991"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.152208 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe6b2f0b-0559-4e95-ab61-b03e24044991-kube-api-access-92rpq" (OuterVolumeSpecName: "kube-api-access-92rpq") pod "fe6b2f0b-0559-4e95-ab61-b03e24044991" (UID: "fe6b2f0b-0559-4e95-ab61-b03e24044991"). InnerVolumeSpecName "kube-api-access-92rpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.167104 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fe6b2f0b-0559-4e95-ab61-b03e24044991" (UID: "fe6b2f0b-0559-4e95-ab61-b03e24044991"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.181864 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "fe6b2f0b-0559-4e95-ab61-b03e24044991" (UID: "fe6b2f0b-0559-4e95-ab61-b03e24044991"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.209413 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe6b2f0b-0559-4e95-ab61-b03e24044991" (UID: "fe6b2f0b-0559-4e95-ab61-b03e24044991"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.237200 4752 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.237250 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.237263 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.237275 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92rpq\" (UniqueName: \"kubernetes.io/projected/fe6b2f0b-0559-4e95-ab61-b03e24044991-kube-api-access-92rpq\") on node \"crc\" DevicePath \"\"" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.237291 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.237305 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/fe6b2f0b-0559-4e95-ab61-b03e24044991-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.250053 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-config-data" (OuterVolumeSpecName: "config-data") pod "fe6b2f0b-0559-4e95-ab61-b03e24044991" (UID: "fe6b2f0b-0559-4e95-ab61-b03e24044991"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.271095 4752 generic.go:334] "Generic (PLEG): container finished" podID="fe6b2f0b-0559-4e95-ab61-b03e24044991" containerID="e6557739b1ea7ba8a590a94c5e65eee60d678c7c1c0eeb5c69d0b79e0ee9d8d7" exitCode=0 Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.271138 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"fe6b2f0b-0559-4e95-ab61-b03e24044991","Type":"ContainerDied","Data":"e6557739b1ea7ba8a590a94c5e65eee60d678c7c1c0eeb5c69d0b79e0ee9d8d7"} Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.271164 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"fe6b2f0b-0559-4e95-ab61-b03e24044991","Type":"ContainerDied","Data":"30fce793ecc1161c8dc28c641d946a5863a186b6ff0cfb3f74213f569406a315"} Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.271175 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.271183 4752 scope.go:117] "RemoveContainer" containerID="971cc5e7f7e7c493f04b0cdac728a4107a5f081d2435406cd9c73a3b388a07e7" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.305196 4752 scope.go:117] "RemoveContainer" containerID="7c2f805d9b21dcf0c65c1f6d70163ea25ae0000cea5432314846db7fa77df469" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.316005 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.330576 4752 scope.go:117] "RemoveContainer" containerID="e6557739b1ea7ba8a590a94c5e65eee60d678c7c1c0eeb5c69d0b79e0ee9d8d7" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.331062 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.348692 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6b2f0b-0559-4e95-ab61-b03e24044991-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.355883 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:06:15 crc kubenswrapper[4752]: E0929 11:06:15.356957 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6b2f0b-0559-4e95-ab61-b03e24044991" containerName="ceilometer-central-agent" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.357004 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6b2f0b-0559-4e95-ab61-b03e24044991" containerName="ceilometer-central-agent" Sep 29 11:06:15 crc kubenswrapper[4752]: E0929 11:06:15.357022 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6b2f0b-0559-4e95-ab61-b03e24044991" containerName="proxy-httpd" Sep 29 11:06:15 crc kubenswrapper[4752]: 
I0929 11:06:15.357029 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6b2f0b-0559-4e95-ab61-b03e24044991" containerName="proxy-httpd" Sep 29 11:06:15 crc kubenswrapper[4752]: E0929 11:06:15.357049 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6b2f0b-0559-4e95-ab61-b03e24044991" containerName="sg-core" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.357054 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6b2f0b-0559-4e95-ab61-b03e24044991" containerName="sg-core" Sep 29 11:06:15 crc kubenswrapper[4752]: E0929 11:06:15.357069 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6b2f0b-0559-4e95-ab61-b03e24044991" containerName="ceilometer-notification-agent" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.357075 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6b2f0b-0559-4e95-ab61-b03e24044991" containerName="ceilometer-notification-agent" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.357267 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6b2f0b-0559-4e95-ab61-b03e24044991" containerName="proxy-httpd" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.357316 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6b2f0b-0559-4e95-ab61-b03e24044991" containerName="ceilometer-central-agent" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.357335 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6b2f0b-0559-4e95-ab61-b03e24044991" containerName="sg-core" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.357351 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6b2f0b-0559-4e95-ab61-b03e24044991" containerName="ceilometer-notification-agent" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.365437 4752 scope.go:117] "RemoveContainer" containerID="acebffe58c91265d3760f6170c07ae08f0fd34f74f970b68852d0b0f3367cbd4" Sep 29 11:06:15 crc kubenswrapper[4752]: 
I0929 11:06:15.371055 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.371155 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.374932 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.375117 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.376693 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.397016 4752 scope.go:117] "RemoveContainer" containerID="971cc5e7f7e7c493f04b0cdac728a4107a5f081d2435406cd9c73a3b388a07e7" Sep 29 11:06:15 crc kubenswrapper[4752]: E0929 11:06:15.397773 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"971cc5e7f7e7c493f04b0cdac728a4107a5f081d2435406cd9c73a3b388a07e7\": container with ID starting with 971cc5e7f7e7c493f04b0cdac728a4107a5f081d2435406cd9c73a3b388a07e7 not found: ID does not exist" containerID="971cc5e7f7e7c493f04b0cdac728a4107a5f081d2435406cd9c73a3b388a07e7" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.397865 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"971cc5e7f7e7c493f04b0cdac728a4107a5f081d2435406cd9c73a3b388a07e7"} err="failed to get container status \"971cc5e7f7e7c493f04b0cdac728a4107a5f081d2435406cd9c73a3b388a07e7\": rpc error: code = NotFound desc = could not find container \"971cc5e7f7e7c493f04b0cdac728a4107a5f081d2435406cd9c73a3b388a07e7\": container with ID starting with 
971cc5e7f7e7c493f04b0cdac728a4107a5f081d2435406cd9c73a3b388a07e7 not found: ID does not exist" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.397903 4752 scope.go:117] "RemoveContainer" containerID="7c2f805d9b21dcf0c65c1f6d70163ea25ae0000cea5432314846db7fa77df469" Sep 29 11:06:15 crc kubenswrapper[4752]: E0929 11:06:15.398392 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c2f805d9b21dcf0c65c1f6d70163ea25ae0000cea5432314846db7fa77df469\": container with ID starting with 7c2f805d9b21dcf0c65c1f6d70163ea25ae0000cea5432314846db7fa77df469 not found: ID does not exist" containerID="7c2f805d9b21dcf0c65c1f6d70163ea25ae0000cea5432314846db7fa77df469" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.398436 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c2f805d9b21dcf0c65c1f6d70163ea25ae0000cea5432314846db7fa77df469"} err="failed to get container status \"7c2f805d9b21dcf0c65c1f6d70163ea25ae0000cea5432314846db7fa77df469\": rpc error: code = NotFound desc = could not find container \"7c2f805d9b21dcf0c65c1f6d70163ea25ae0000cea5432314846db7fa77df469\": container with ID starting with 7c2f805d9b21dcf0c65c1f6d70163ea25ae0000cea5432314846db7fa77df469 not found: ID does not exist" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.398469 4752 scope.go:117] "RemoveContainer" containerID="e6557739b1ea7ba8a590a94c5e65eee60d678c7c1c0eeb5c69d0b79e0ee9d8d7" Sep 29 11:06:15 crc kubenswrapper[4752]: E0929 11:06:15.398827 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6557739b1ea7ba8a590a94c5e65eee60d678c7c1c0eeb5c69d0b79e0ee9d8d7\": container with ID starting with e6557739b1ea7ba8a590a94c5e65eee60d678c7c1c0eeb5c69d0b79e0ee9d8d7 not found: ID does not exist" containerID="e6557739b1ea7ba8a590a94c5e65eee60d678c7c1c0eeb5c69d0b79e0ee9d8d7" Sep 29 11:06:15 crc 
kubenswrapper[4752]: I0929 11:06:15.398866 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6557739b1ea7ba8a590a94c5e65eee60d678c7c1c0eeb5c69d0b79e0ee9d8d7"} err="failed to get container status \"e6557739b1ea7ba8a590a94c5e65eee60d678c7c1c0eeb5c69d0b79e0ee9d8d7\": rpc error: code = NotFound desc = could not find container \"e6557739b1ea7ba8a590a94c5e65eee60d678c7c1c0eeb5c69d0b79e0ee9d8d7\": container with ID starting with e6557739b1ea7ba8a590a94c5e65eee60d678c7c1c0eeb5c69d0b79e0ee9d8d7 not found: ID does not exist" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.398891 4752 scope.go:117] "RemoveContainer" containerID="acebffe58c91265d3760f6170c07ae08f0fd34f74f970b68852d0b0f3367cbd4" Sep 29 11:06:15 crc kubenswrapper[4752]: E0929 11:06:15.399169 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acebffe58c91265d3760f6170c07ae08f0fd34f74f970b68852d0b0f3367cbd4\": container with ID starting with acebffe58c91265d3760f6170c07ae08f0fd34f74f970b68852d0b0f3367cbd4 not found: ID does not exist" containerID="acebffe58c91265d3760f6170c07ae08f0fd34f74f970b68852d0b0f3367cbd4" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.399189 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acebffe58c91265d3760f6170c07ae08f0fd34f74f970b68852d0b0f3367cbd4"} err="failed to get container status \"acebffe58c91265d3760f6170c07ae08f0fd34f74f970b68852d0b0f3367cbd4\": rpc error: code = NotFound desc = could not find container \"acebffe58c91265d3760f6170c07ae08f0fd34f74f970b68852d0b0f3367cbd4\": container with ID starting with acebffe58c91265d3760f6170c07ae08f0fd34f74f970b68852d0b0f3367cbd4 not found: ID does not exist" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.553270 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-run-httpd\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.553348 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-config-data\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.553455 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-log-httpd\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.553486 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.553516 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.553539 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-scripts\") pod 
\"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.553558 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw5vm\" (UniqueName: \"kubernetes.io/projected/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-kube-api-access-dw5vm\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.553759 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.655760 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-config-data\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.655968 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-log-httpd\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.656015 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" 
Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.656071 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.656119 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-scripts\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.656154 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw5vm\" (UniqueName: \"kubernetes.io/projected/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-kube-api-access-dw5vm\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.656195 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.656375 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-run-httpd\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.657044 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-log-httpd\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.657063 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-run-httpd\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.662371 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-scripts\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.662453 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.662677 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.663605 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 
29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.663665 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-config-data\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.685525 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw5vm\" (UniqueName: \"kubernetes.io/projected/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-kube-api-access-dw5vm\") pod \"ceilometer-0\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.690567 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:15 crc kubenswrapper[4752]: I0929 11:06:15.941171 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:06:16 crc kubenswrapper[4752]: I0929 11:06:16.044619 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe6b2f0b-0559-4e95-ab61-b03e24044991" path="/var/lib/kubelet/pods/fe6b2f0b-0559-4e95-ab61-b03e24044991/volumes" Sep 29 11:06:16 crc kubenswrapper[4752]: I0929 11:06:16.280275 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c","Type":"ContainerStarted","Data":"5e0dae4ff71d14ad4590734a7e85850000356c98c598de3fc41615a06486cfbf"} Sep 29 11:06:17 crc kubenswrapper[4752]: I0929 11:06:17.293210 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c","Type":"ContainerStarted","Data":"0b1773cffe4f89c66cb8c7d039ce2f198253a606f840e6a7d055078ef0e72e02"} Sep 29 11:06:17 crc kubenswrapper[4752]: I0929 11:06:17.293619 
4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c","Type":"ContainerStarted","Data":"be18ce7da4565251bc9b510cae875334df0124db25fe73cbb79e356458a6f1b3"} Sep 29 11:06:18 crc kubenswrapper[4752]: I0929 11:06:18.304729 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c","Type":"ContainerStarted","Data":"724a1a14fc70744885e26a376743ca369ceb4826172cd3a5651fa92076a68fd9"} Sep 29 11:06:20 crc kubenswrapper[4752]: I0929 11:06:20.357536 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c","Type":"ContainerStarted","Data":"cd7c78b58f990780c3e6ac29d379c9b8f7a84c8e5c6511f5f02d022a2ff78128"} Sep 29 11:06:20 crc kubenswrapper[4752]: I0929 11:06:20.358974 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:45 crc kubenswrapper[4752]: I0929 11:06:45.700118 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:06:45 crc kubenswrapper[4752]: I0929 11:06:45.730904 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=27.362323202 podStartE2EDuration="30.730787676s" podCreationTimestamp="2025-09-29 11:06:15 +0000 UTC" firstStartedPulling="2025-09-29 11:06:15.956490373 +0000 UTC m=+1316.745632030" lastFinishedPulling="2025-09-29 11:06:19.324954837 +0000 UTC m=+1320.114096504" observedRunningTime="2025-09-29 11:06:20.403874144 +0000 UTC m=+1321.193015821" watchObservedRunningTime="2025-09-29 11:06:45.730787676 +0000 UTC m=+1346.519929343" Sep 29 11:07:37 crc kubenswrapper[4752]: I0929 11:07:37.587222 4752 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-xs4hd"] Sep 29 11:07:37 crc kubenswrapper[4752]: I0929 11:07:37.593982 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xs4hd" Sep 29 11:07:37 crc kubenswrapper[4752]: I0929 11:07:37.605624 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xs4hd"] Sep 29 11:07:37 crc kubenswrapper[4752]: I0929 11:07:37.712379 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qklxr\" (UniqueName: \"kubernetes.io/projected/b99a9da9-9d27-432b-bb9b-8438f818c766-kube-api-access-qklxr\") pod \"redhat-marketplace-xs4hd\" (UID: \"b99a9da9-9d27-432b-bb9b-8438f818c766\") " pod="openshift-marketplace/redhat-marketplace-xs4hd" Sep 29 11:07:37 crc kubenswrapper[4752]: I0929 11:07:37.712452 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99a9da9-9d27-432b-bb9b-8438f818c766-catalog-content\") pod \"redhat-marketplace-xs4hd\" (UID: \"b99a9da9-9d27-432b-bb9b-8438f818c766\") " pod="openshift-marketplace/redhat-marketplace-xs4hd" Sep 29 11:07:37 crc kubenswrapper[4752]: I0929 11:07:37.712536 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99a9da9-9d27-432b-bb9b-8438f818c766-utilities\") pod \"redhat-marketplace-xs4hd\" (UID: \"b99a9da9-9d27-432b-bb9b-8438f818c766\") " pod="openshift-marketplace/redhat-marketplace-xs4hd" Sep 29 11:07:37 crc kubenswrapper[4752]: I0929 11:07:37.814360 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qklxr\" (UniqueName: \"kubernetes.io/projected/b99a9da9-9d27-432b-bb9b-8438f818c766-kube-api-access-qklxr\") pod \"redhat-marketplace-xs4hd\" (UID: 
\"b99a9da9-9d27-432b-bb9b-8438f818c766\") " pod="openshift-marketplace/redhat-marketplace-xs4hd"
Sep 29 11:07:37 crc kubenswrapper[4752]: I0929 11:07:37.814429 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99a9da9-9d27-432b-bb9b-8438f818c766-catalog-content\") pod \"redhat-marketplace-xs4hd\" (UID: \"b99a9da9-9d27-432b-bb9b-8438f818c766\") " pod="openshift-marketplace/redhat-marketplace-xs4hd"
Sep 29 11:07:37 crc kubenswrapper[4752]: I0929 11:07:37.814480 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99a9da9-9d27-432b-bb9b-8438f818c766-utilities\") pod \"redhat-marketplace-xs4hd\" (UID: \"b99a9da9-9d27-432b-bb9b-8438f818c766\") " pod="openshift-marketplace/redhat-marketplace-xs4hd"
Sep 29 11:07:37 crc kubenswrapper[4752]: I0929 11:07:37.815112 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99a9da9-9d27-432b-bb9b-8438f818c766-utilities\") pod \"redhat-marketplace-xs4hd\" (UID: \"b99a9da9-9d27-432b-bb9b-8438f818c766\") " pod="openshift-marketplace/redhat-marketplace-xs4hd"
Sep 29 11:07:37 crc kubenswrapper[4752]: I0929 11:07:37.815337 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99a9da9-9d27-432b-bb9b-8438f818c766-catalog-content\") pod \"redhat-marketplace-xs4hd\" (UID: \"b99a9da9-9d27-432b-bb9b-8438f818c766\") " pod="openshift-marketplace/redhat-marketplace-xs4hd"
Sep 29 11:07:37 crc kubenswrapper[4752]: I0929 11:07:37.836401 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qklxr\" (UniqueName: \"kubernetes.io/projected/b99a9da9-9d27-432b-bb9b-8438f818c766-kube-api-access-qklxr\") pod \"redhat-marketplace-xs4hd\" (UID: \"b99a9da9-9d27-432b-bb9b-8438f818c766\") " pod="openshift-marketplace/redhat-marketplace-xs4hd"
Sep 29 11:07:37 crc kubenswrapper[4752]: I0929 11:07:37.940399 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xs4hd"
Sep 29 11:07:38 crc kubenswrapper[4752]: I0929 11:07:38.389844 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xs4hd"]
Sep 29 11:07:39 crc kubenswrapper[4752]: I0929 11:07:39.082952 4752 generic.go:334] "Generic (PLEG): container finished" podID="b99a9da9-9d27-432b-bb9b-8438f818c766" containerID="a1e37baf9da9b7233e8376e7816e76e1dbcab69e4dafdd74df1a518b4346353b" exitCode=0
Sep 29 11:07:39 crc kubenswrapper[4752]: I0929 11:07:39.083118 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs4hd" event={"ID":"b99a9da9-9d27-432b-bb9b-8438f818c766","Type":"ContainerDied","Data":"a1e37baf9da9b7233e8376e7816e76e1dbcab69e4dafdd74df1a518b4346353b"}
Sep 29 11:07:39 crc kubenswrapper[4752]: I0929 11:07:39.083346 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs4hd" event={"ID":"b99a9da9-9d27-432b-bb9b-8438f818c766","Type":"ContainerStarted","Data":"7a0a9909589334f77d00fd6a02e85bb60c8c335a6cbd0e7ac4ead0df12b0f518"}
Sep 29 11:07:40 crc kubenswrapper[4752]: I0929 11:07:40.093841 4752 generic.go:334] "Generic (PLEG): container finished" podID="b99a9da9-9d27-432b-bb9b-8438f818c766" containerID="852e043dc0888f2bc924d2d297bde9961bb73985c8bc896c9ddb0c12e17fd9b9" exitCode=0
Sep 29 11:07:40 crc kubenswrapper[4752]: I0929 11:07:40.093893 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs4hd" event={"ID":"b99a9da9-9d27-432b-bb9b-8438f818c766","Type":"ContainerDied","Data":"852e043dc0888f2bc924d2d297bde9961bb73985c8bc896c9ddb0c12e17fd9b9"}
Sep 29 11:07:41 crc kubenswrapper[4752]: I0929 11:07:41.106210 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs4hd" event={"ID":"b99a9da9-9d27-432b-bb9b-8438f818c766","Type":"ContainerStarted","Data":"fdde83ec357965835b8d60cb9b58a7f85716f789bb09d89a8e965e0329d7503d"}
Sep 29 11:07:41 crc kubenswrapper[4752]: I0929 11:07:41.133578 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xs4hd" podStartSLOduration=2.6478235469999998 podStartE2EDuration="4.133560581s" podCreationTimestamp="2025-09-29 11:07:37 +0000 UTC" firstStartedPulling="2025-09-29 11:07:39.08505442 +0000 UTC m=+1399.874196117" lastFinishedPulling="2025-09-29 11:07:40.570791484 +0000 UTC m=+1401.359933151" observedRunningTime="2025-09-29 11:07:41.129789802 +0000 UTC m=+1401.918931479" watchObservedRunningTime="2025-09-29 11:07:41.133560581 +0000 UTC m=+1401.922702258"
Sep 29 11:07:47 crc kubenswrapper[4752]: I0929 11:07:47.941025 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xs4hd"
Sep 29 11:07:47 crc kubenswrapper[4752]: I0929 11:07:47.941435 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xs4hd"
Sep 29 11:07:48 crc kubenswrapper[4752]: I0929 11:07:48.084209 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xs4hd"
Sep 29 11:07:48 crc kubenswrapper[4752]: I0929 11:07:48.211704 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xs4hd"
Sep 29 11:07:51 crc kubenswrapper[4752]: I0929 11:07:51.560742 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xs4hd"]
Sep 29 11:07:51 crc kubenswrapper[4752]: I0929 11:07:51.561266 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xs4hd" podUID="b99a9da9-9d27-432b-bb9b-8438f818c766" containerName="registry-server" containerID="cri-o://fdde83ec357965835b8d60cb9b58a7f85716f789bb09d89a8e965e0329d7503d" gracePeriod=2
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.061530 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xs4hd"
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.163941 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99a9da9-9d27-432b-bb9b-8438f818c766-catalog-content\") pod \"b99a9da9-9d27-432b-bb9b-8438f818c766\" (UID: \"b99a9da9-9d27-432b-bb9b-8438f818c766\") "
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.164318 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qklxr\" (UniqueName: \"kubernetes.io/projected/b99a9da9-9d27-432b-bb9b-8438f818c766-kube-api-access-qklxr\") pod \"b99a9da9-9d27-432b-bb9b-8438f818c766\" (UID: \"b99a9da9-9d27-432b-bb9b-8438f818c766\") "
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.164466 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99a9da9-9d27-432b-bb9b-8438f818c766-utilities\") pod \"b99a9da9-9d27-432b-bb9b-8438f818c766\" (UID: \"b99a9da9-9d27-432b-bb9b-8438f818c766\") "
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.166155 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b99a9da9-9d27-432b-bb9b-8438f818c766-utilities" (OuterVolumeSpecName: "utilities") pod "b99a9da9-9d27-432b-bb9b-8438f818c766" (UID: "b99a9da9-9d27-432b-bb9b-8438f818c766"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.183082 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b99a9da9-9d27-432b-bb9b-8438f818c766-kube-api-access-qklxr" (OuterVolumeSpecName: "kube-api-access-qklxr") pod "b99a9da9-9d27-432b-bb9b-8438f818c766" (UID: "b99a9da9-9d27-432b-bb9b-8438f818c766"). InnerVolumeSpecName "kube-api-access-qklxr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.189651 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b99a9da9-9d27-432b-bb9b-8438f818c766-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b99a9da9-9d27-432b-bb9b-8438f818c766" (UID: "b99a9da9-9d27-432b-bb9b-8438f818c766"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.211775 4752 generic.go:334] "Generic (PLEG): container finished" podID="b99a9da9-9d27-432b-bb9b-8438f818c766" containerID="fdde83ec357965835b8d60cb9b58a7f85716f789bb09d89a8e965e0329d7503d" exitCode=0
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.211854 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs4hd" event={"ID":"b99a9da9-9d27-432b-bb9b-8438f818c766","Type":"ContainerDied","Data":"fdde83ec357965835b8d60cb9b58a7f85716f789bb09d89a8e965e0329d7503d"}
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.211885 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs4hd" event={"ID":"b99a9da9-9d27-432b-bb9b-8438f818c766","Type":"ContainerDied","Data":"7a0a9909589334f77d00fd6a02e85bb60c8c335a6cbd0e7ac4ead0df12b0f518"}
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.211907 4752 scope.go:117] "RemoveContainer" containerID="fdde83ec357965835b8d60cb9b58a7f85716f789bb09d89a8e965e0329d7503d"
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.212051 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xs4hd"
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.254221 4752 scope.go:117] "RemoveContainer" containerID="852e043dc0888f2bc924d2d297bde9961bb73985c8bc896c9ddb0c12e17fd9b9"
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.254373 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xs4hd"]
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.262546 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xs4hd"]
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.272338 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qklxr\" (UniqueName: \"kubernetes.io/projected/b99a9da9-9d27-432b-bb9b-8438f818c766-kube-api-access-qklxr\") on node \"crc\" DevicePath \"\""
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.272370 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99a9da9-9d27-432b-bb9b-8438f818c766-utilities\") on node \"crc\" DevicePath \"\""
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.272385 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99a9da9-9d27-432b-bb9b-8438f818c766-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.309243 4752 scope.go:117] "RemoveContainer" containerID="a1e37baf9da9b7233e8376e7816e76e1dbcab69e4dafdd74df1a518b4346353b"
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.360253 4752 scope.go:117] "RemoveContainer" containerID="fdde83ec357965835b8d60cb9b58a7f85716f789bb09d89a8e965e0329d7503d"
Sep 29 11:07:52 crc kubenswrapper[4752]: E0929 11:07:52.363047 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdde83ec357965835b8d60cb9b58a7f85716f789bb09d89a8e965e0329d7503d\": container with ID starting with fdde83ec357965835b8d60cb9b58a7f85716f789bb09d89a8e965e0329d7503d not found: ID does not exist" containerID="fdde83ec357965835b8d60cb9b58a7f85716f789bb09d89a8e965e0329d7503d"
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.363084 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdde83ec357965835b8d60cb9b58a7f85716f789bb09d89a8e965e0329d7503d"} err="failed to get container status \"fdde83ec357965835b8d60cb9b58a7f85716f789bb09d89a8e965e0329d7503d\": rpc error: code = NotFound desc = could not find container \"fdde83ec357965835b8d60cb9b58a7f85716f789bb09d89a8e965e0329d7503d\": container with ID starting with fdde83ec357965835b8d60cb9b58a7f85716f789bb09d89a8e965e0329d7503d not found: ID does not exist"
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.363110 4752 scope.go:117] "RemoveContainer" containerID="852e043dc0888f2bc924d2d297bde9961bb73985c8bc896c9ddb0c12e17fd9b9"
Sep 29 11:07:52 crc kubenswrapper[4752]: E0929 11:07:52.363307 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"852e043dc0888f2bc924d2d297bde9961bb73985c8bc896c9ddb0c12e17fd9b9\": container with ID starting with 852e043dc0888f2bc924d2d297bde9961bb73985c8bc896c9ddb0c12e17fd9b9 not found: ID does not exist" containerID="852e043dc0888f2bc924d2d297bde9961bb73985c8bc896c9ddb0c12e17fd9b9"
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.363332 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"852e043dc0888f2bc924d2d297bde9961bb73985c8bc896c9ddb0c12e17fd9b9"} err="failed to get container status \"852e043dc0888f2bc924d2d297bde9961bb73985c8bc896c9ddb0c12e17fd9b9\": rpc error: code = NotFound desc = could not find container \"852e043dc0888f2bc924d2d297bde9961bb73985c8bc896c9ddb0c12e17fd9b9\": container with ID starting with 852e043dc0888f2bc924d2d297bde9961bb73985c8bc896c9ddb0c12e17fd9b9 not found: ID does not exist"
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.363351 4752 scope.go:117] "RemoveContainer" containerID="a1e37baf9da9b7233e8376e7816e76e1dbcab69e4dafdd74df1a518b4346353b"
Sep 29 11:07:52 crc kubenswrapper[4752]: E0929 11:07:52.363551 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1e37baf9da9b7233e8376e7816e76e1dbcab69e4dafdd74df1a518b4346353b\": container with ID starting with a1e37baf9da9b7233e8376e7816e76e1dbcab69e4dafdd74df1a518b4346353b not found: ID does not exist" containerID="a1e37baf9da9b7233e8376e7816e76e1dbcab69e4dafdd74df1a518b4346353b"
Sep 29 11:07:52 crc kubenswrapper[4752]: I0929 11:07:52.363576 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1e37baf9da9b7233e8376e7816e76e1dbcab69e4dafdd74df1a518b4346353b"} err="failed to get container status \"a1e37baf9da9b7233e8376e7816e76e1dbcab69e4dafdd74df1a518b4346353b\": rpc error: code = NotFound desc = could not find container \"a1e37baf9da9b7233e8376e7816e76e1dbcab69e4dafdd74df1a518b4346353b\": container with ID starting with a1e37baf9da9b7233e8376e7816e76e1dbcab69e4dafdd74df1a518b4346353b not found: ID does not exist"
Sep 29 11:07:54 crc kubenswrapper[4752]: I0929 11:07:54.042155 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b99a9da9-9d27-432b-bb9b-8438f818c766" path="/var/lib/kubelet/pods/b99a9da9-9d27-432b-bb9b-8438f818c766/volumes"
Sep 29 11:07:56 crc kubenswrapper[4752]: I0929 11:07:56.176044 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 11:07:56 crc kubenswrapper[4752]: I0929 11:07:56.176121 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 11:08:26 crc kubenswrapper[4752]: I0929 11:08:26.175410 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 11:08:26 crc kubenswrapper[4752]: I0929 11:08:26.176176 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 11:08:42 crc kubenswrapper[4752]: I0929 11:08:42.378945 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5xz8h"]
Sep 29 11:08:42 crc kubenswrapper[4752]: E0929 11:08:42.380397 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b99a9da9-9d27-432b-bb9b-8438f818c766" containerName="extract-content"
Sep 29 11:08:42 crc kubenswrapper[4752]: I0929 11:08:42.380417 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b99a9da9-9d27-432b-bb9b-8438f818c766" containerName="extract-content"
Sep 29 11:08:42 crc kubenswrapper[4752]: E0929 11:08:42.380435 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b99a9da9-9d27-432b-bb9b-8438f818c766" containerName="registry-server"
Sep 29 11:08:42 crc kubenswrapper[4752]: I0929 11:08:42.380443 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b99a9da9-9d27-432b-bb9b-8438f818c766" containerName="registry-server"
Sep 29 11:08:42 crc kubenswrapper[4752]: E0929 11:08:42.380462 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b99a9da9-9d27-432b-bb9b-8438f818c766" containerName="extract-utilities"
Sep 29 11:08:42 crc kubenswrapper[4752]: I0929 11:08:42.380471 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b99a9da9-9d27-432b-bb9b-8438f818c766" containerName="extract-utilities"
Sep 29 11:08:42 crc kubenswrapper[4752]: I0929 11:08:42.380698 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="b99a9da9-9d27-432b-bb9b-8438f818c766" containerName="registry-server"
Sep 29 11:08:42 crc kubenswrapper[4752]: I0929 11:08:42.385650 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5xz8h"
Sep 29 11:08:42 crc kubenswrapper[4752]: I0929 11:08:42.406622 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5xz8h"]
Sep 29 11:08:42 crc kubenswrapper[4752]: I0929 11:08:42.446534 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9859752d-fefb-4fc1-8af4-3daa54e9b0ed-utilities\") pod \"community-operators-5xz8h\" (UID: \"9859752d-fefb-4fc1-8af4-3daa54e9b0ed\") " pod="openshift-marketplace/community-operators-5xz8h"
Sep 29 11:08:42 crc kubenswrapper[4752]: I0929 11:08:42.446621 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb77w\" (UniqueName: \"kubernetes.io/projected/9859752d-fefb-4fc1-8af4-3daa54e9b0ed-kube-api-access-pb77w\") pod \"community-operators-5xz8h\" (UID: \"9859752d-fefb-4fc1-8af4-3daa54e9b0ed\") " pod="openshift-marketplace/community-operators-5xz8h"
Sep 29 11:08:42 crc kubenswrapper[4752]: I0929 11:08:42.446836 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9859752d-fefb-4fc1-8af4-3daa54e9b0ed-catalog-content\") pod \"community-operators-5xz8h\" (UID: \"9859752d-fefb-4fc1-8af4-3daa54e9b0ed\") " pod="openshift-marketplace/community-operators-5xz8h"
Sep 29 11:08:42 crc kubenswrapper[4752]: I0929 11:08:42.548373 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9859752d-fefb-4fc1-8af4-3daa54e9b0ed-utilities\") pod \"community-operators-5xz8h\" (UID: \"9859752d-fefb-4fc1-8af4-3daa54e9b0ed\") " pod="openshift-marketplace/community-operators-5xz8h"
Sep 29 11:08:42 crc kubenswrapper[4752]: I0929 11:08:42.548456 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb77w\" (UniqueName: \"kubernetes.io/projected/9859752d-fefb-4fc1-8af4-3daa54e9b0ed-kube-api-access-pb77w\") pod \"community-operators-5xz8h\" (UID: \"9859752d-fefb-4fc1-8af4-3daa54e9b0ed\") " pod="openshift-marketplace/community-operators-5xz8h"
Sep 29 11:08:42 crc kubenswrapper[4752]: I0929 11:08:42.548548 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9859752d-fefb-4fc1-8af4-3daa54e9b0ed-catalog-content\") pod \"community-operators-5xz8h\" (UID: \"9859752d-fefb-4fc1-8af4-3daa54e9b0ed\") " pod="openshift-marketplace/community-operators-5xz8h"
Sep 29 11:08:42 crc kubenswrapper[4752]: I0929 11:08:42.549070 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9859752d-fefb-4fc1-8af4-3daa54e9b0ed-catalog-content\") pod \"community-operators-5xz8h\" (UID: \"9859752d-fefb-4fc1-8af4-3daa54e9b0ed\") " pod="openshift-marketplace/community-operators-5xz8h"
Sep 29 11:08:42 crc kubenswrapper[4752]: I0929 11:08:42.549239 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9859752d-fefb-4fc1-8af4-3daa54e9b0ed-utilities\") pod \"community-operators-5xz8h\" (UID: \"9859752d-fefb-4fc1-8af4-3daa54e9b0ed\") " pod="openshift-marketplace/community-operators-5xz8h"
Sep 29 11:08:42 crc kubenswrapper[4752]: I0929 11:08:42.571544 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb77w\" (UniqueName: \"kubernetes.io/projected/9859752d-fefb-4fc1-8af4-3daa54e9b0ed-kube-api-access-pb77w\") pod \"community-operators-5xz8h\" (UID: \"9859752d-fefb-4fc1-8af4-3daa54e9b0ed\") " pod="openshift-marketplace/community-operators-5xz8h"
Sep 29 11:08:42 crc kubenswrapper[4752]: I0929 11:08:42.712251 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5xz8h"
Sep 29 11:08:43 crc kubenswrapper[4752]: I0929 11:08:43.031996 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5xz8h"]
Sep 29 11:08:43 crc kubenswrapper[4752]: I0929 11:08:43.699815 4752 generic.go:334] "Generic (PLEG): container finished" podID="9859752d-fefb-4fc1-8af4-3daa54e9b0ed" containerID="c70d2a10c95b3ff6d6d029c7feff77846feca7ce3b827aca5725fd9092848c38" exitCode=0
Sep 29 11:08:43 crc kubenswrapper[4752]: I0929 11:08:43.699926 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5xz8h" event={"ID":"9859752d-fefb-4fc1-8af4-3daa54e9b0ed","Type":"ContainerDied","Data":"c70d2a10c95b3ff6d6d029c7feff77846feca7ce3b827aca5725fd9092848c38"}
Sep 29 11:08:43 crc kubenswrapper[4752]: I0929 11:08:43.700114 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5xz8h" event={"ID":"9859752d-fefb-4fc1-8af4-3daa54e9b0ed","Type":"ContainerStarted","Data":"830ab7e39c333472dc63e86ba2f4e768c1a10d1a0a279016be6be2ddfbdbc867"}
Sep 29 11:08:45 crc kubenswrapper[4752]: I0929 11:08:45.721215 4752 generic.go:334] "Generic (PLEG): container finished" podID="9859752d-fefb-4fc1-8af4-3daa54e9b0ed" containerID="410bccda64adfeff6f5b68bef6aabd7d4dfe4caeb03ca091406eea252c0600d1" exitCode=0
Sep 29 11:08:45 crc kubenswrapper[4752]: I0929 11:08:45.721279 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5xz8h" event={"ID":"9859752d-fefb-4fc1-8af4-3daa54e9b0ed","Type":"ContainerDied","Data":"410bccda64adfeff6f5b68bef6aabd7d4dfe4caeb03ca091406eea252c0600d1"}
Sep 29 11:08:46 crc kubenswrapper[4752]: I0929 11:08:46.734644 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5xz8h" event={"ID":"9859752d-fefb-4fc1-8af4-3daa54e9b0ed","Type":"ContainerStarted","Data":"5a9f5b0277b81a2b37fedfb04aaaa356c8b91a5fd81588047df4b7ec1fc8aafb"}
Sep 29 11:08:46 crc kubenswrapper[4752]: I0929 11:08:46.756480 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5xz8h" podStartSLOduration=2.326635769 podStartE2EDuration="4.756443992s" podCreationTimestamp="2025-09-29 11:08:42 +0000 UTC" firstStartedPulling="2025-09-29 11:08:43.702396761 +0000 UTC m=+1464.491538418" lastFinishedPulling="2025-09-29 11:08:46.132204964 +0000 UTC m=+1466.921346641" observedRunningTime="2025-09-29 11:08:46.752839848 +0000 UTC m=+1467.541981545" watchObservedRunningTime="2025-09-29 11:08:46.756443992 +0000 UTC m=+1467.545585709"
Sep 29 11:08:52 crc kubenswrapper[4752]: I0929 11:08:52.713127 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5xz8h"
Sep 29 11:08:52 crc kubenswrapper[4752]: I0929 11:08:52.713793 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5xz8h"
Sep 29 11:08:52 crc kubenswrapper[4752]: I0929 11:08:52.765182 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5xz8h"
Sep 29 11:08:52 crc kubenswrapper[4752]: I0929 11:08:52.833857 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5xz8h"
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.175992 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.176673 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.176738 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs"
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.177703 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670"} pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.177836 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" containerID="cri-o://18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" gracePeriod=600
Sep 29 11:08:56 crc kubenswrapper[4752]: E0929 11:08:56.301189 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs"
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.369626 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5xz8h"]
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.370095 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5xz8h" podUID="9859752d-fefb-4fc1-8af4-3daa54e9b0ed" containerName="registry-server" containerID="cri-o://5a9f5b0277b81a2b37fedfb04aaaa356c8b91a5fd81588047df4b7ec1fc8aafb" gracePeriod=2
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.817129 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5xz8h"
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.820639 4752 generic.go:334] "Generic (PLEG): container finished" podID="5863c243-797d-462a-b11f-71aaf005f8d1" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" exitCode=0
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.820838 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerDied","Data":"18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670"}
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.821077 4752 scope.go:117] "RemoveContainer" containerID="163ff8dbb1a373e991e8699e30ebf0d1354dad4f96196cd59c49a7d6edcb147e"
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.822105 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670"
Sep 29 11:08:56 crc kubenswrapper[4752]: E0929 11:08:56.822461 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1"
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.824592 4752 generic.go:334] "Generic (PLEG): container finished" podID="9859752d-fefb-4fc1-8af4-3daa54e9b0ed" containerID="5a9f5b0277b81a2b37fedfb04aaaa356c8b91a5fd81588047df4b7ec1fc8aafb" exitCode=0
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.824626 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5xz8h" event={"ID":"9859752d-fefb-4fc1-8af4-3daa54e9b0ed","Type":"ContainerDied","Data":"5a9f5b0277b81a2b37fedfb04aaaa356c8b91a5fd81588047df4b7ec1fc8aafb"}
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.824649 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5xz8h" event={"ID":"9859752d-fefb-4fc1-8af4-3daa54e9b0ed","Type":"ContainerDied","Data":"830ab7e39c333472dc63e86ba2f4e768c1a10d1a0a279016be6be2ddfbdbc867"}
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.824676 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5xz8h"
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.856582 4752 scope.go:117] "RemoveContainer" containerID="5a9f5b0277b81a2b37fedfb04aaaa356c8b91a5fd81588047df4b7ec1fc8aafb"
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.879414 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9859752d-fefb-4fc1-8af4-3daa54e9b0ed-utilities\") pod \"9859752d-fefb-4fc1-8af4-3daa54e9b0ed\" (UID: \"9859752d-fefb-4fc1-8af4-3daa54e9b0ed\") "
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.879500 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb77w\" (UniqueName: \"kubernetes.io/projected/9859752d-fefb-4fc1-8af4-3daa54e9b0ed-kube-api-access-pb77w\") pod \"9859752d-fefb-4fc1-8af4-3daa54e9b0ed\" (UID: \"9859752d-fefb-4fc1-8af4-3daa54e9b0ed\") "
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.879549 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9859752d-fefb-4fc1-8af4-3daa54e9b0ed-catalog-content\") pod \"9859752d-fefb-4fc1-8af4-3daa54e9b0ed\" (UID: \"9859752d-fefb-4fc1-8af4-3daa54e9b0ed\") "
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.881879 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9859752d-fefb-4fc1-8af4-3daa54e9b0ed-utilities" (OuterVolumeSpecName: "utilities") pod "9859752d-fefb-4fc1-8af4-3daa54e9b0ed" (UID: "9859752d-fefb-4fc1-8af4-3daa54e9b0ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.885264 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9859752d-fefb-4fc1-8af4-3daa54e9b0ed-kube-api-access-pb77w" (OuterVolumeSpecName: "kube-api-access-pb77w") pod "9859752d-fefb-4fc1-8af4-3daa54e9b0ed" (UID: "9859752d-fefb-4fc1-8af4-3daa54e9b0ed"). InnerVolumeSpecName "kube-api-access-pb77w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.889553 4752 scope.go:117] "RemoveContainer" containerID="410bccda64adfeff6f5b68bef6aabd7d4dfe4caeb03ca091406eea252c0600d1"
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.934701 4752 scope.go:117] "RemoveContainer" containerID="c70d2a10c95b3ff6d6d029c7feff77846feca7ce3b827aca5725fd9092848c38"
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.937534 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9859752d-fefb-4fc1-8af4-3daa54e9b0ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9859752d-fefb-4fc1-8af4-3daa54e9b0ed" (UID: "9859752d-fefb-4fc1-8af4-3daa54e9b0ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.953759 4752 scope.go:117] "RemoveContainer" containerID="5a9f5b0277b81a2b37fedfb04aaaa356c8b91a5fd81588047df4b7ec1fc8aafb"
Sep 29 11:08:56 crc kubenswrapper[4752]: E0929 11:08:56.954291 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a9f5b0277b81a2b37fedfb04aaaa356c8b91a5fd81588047df4b7ec1fc8aafb\": container with ID starting with 5a9f5b0277b81a2b37fedfb04aaaa356c8b91a5fd81588047df4b7ec1fc8aafb not found: ID does not exist" containerID="5a9f5b0277b81a2b37fedfb04aaaa356c8b91a5fd81588047df4b7ec1fc8aafb"
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.954339 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a9f5b0277b81a2b37fedfb04aaaa356c8b91a5fd81588047df4b7ec1fc8aafb"} err="failed to get container status \"5a9f5b0277b81a2b37fedfb04aaaa356c8b91a5fd81588047df4b7ec1fc8aafb\": rpc error: code = NotFound desc = could not find container \"5a9f5b0277b81a2b37fedfb04aaaa356c8b91a5fd81588047df4b7ec1fc8aafb\": container with ID starting with 5a9f5b0277b81a2b37fedfb04aaaa356c8b91a5fd81588047df4b7ec1fc8aafb not found: ID does not exist"
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.954368 4752 scope.go:117] "RemoveContainer" containerID="410bccda64adfeff6f5b68bef6aabd7d4dfe4caeb03ca091406eea252c0600d1"
Sep 29 11:08:56 crc kubenswrapper[4752]: E0929 11:08:56.954726 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"410bccda64adfeff6f5b68bef6aabd7d4dfe4caeb03ca091406eea252c0600d1\": container with ID starting with 410bccda64adfeff6f5b68bef6aabd7d4dfe4caeb03ca091406eea252c0600d1 not found: ID does not exist" containerID="410bccda64adfeff6f5b68bef6aabd7d4dfe4caeb03ca091406eea252c0600d1"
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.954763 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"410bccda64adfeff6f5b68bef6aabd7d4dfe4caeb03ca091406eea252c0600d1"} err="failed to get container status \"410bccda64adfeff6f5b68bef6aabd7d4dfe4caeb03ca091406eea252c0600d1\": rpc error: code = NotFound desc = could not find container \"410bccda64adfeff6f5b68bef6aabd7d4dfe4caeb03ca091406eea252c0600d1\": container with ID starting with 410bccda64adfeff6f5b68bef6aabd7d4dfe4caeb03ca091406eea252c0600d1 not found: ID does not exist"
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.954789 4752 scope.go:117] "RemoveContainer" containerID="c70d2a10c95b3ff6d6d029c7feff77846feca7ce3b827aca5725fd9092848c38"
Sep 29 11:08:56 crc kubenswrapper[4752]: E0929 11:08:56.955053 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c70d2a10c95b3ff6d6d029c7feff77846feca7ce3b827aca5725fd9092848c38\": container with ID starting with c70d2a10c95b3ff6d6d029c7feff77846feca7ce3b827aca5725fd9092848c38 not found: ID does not exist" containerID="c70d2a10c95b3ff6d6d029c7feff77846feca7ce3b827aca5725fd9092848c38"
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.955082 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c70d2a10c95b3ff6d6d029c7feff77846feca7ce3b827aca5725fd9092848c38"} err="failed to get container status \"c70d2a10c95b3ff6d6d029c7feff77846feca7ce3b827aca5725fd9092848c38\": rpc error: code = NotFound desc = could not find container \"c70d2a10c95b3ff6d6d029c7feff77846feca7ce3b827aca5725fd9092848c38\": container with ID starting with c70d2a10c95b3ff6d6d029c7feff77846feca7ce3b827aca5725fd9092848c38 not found: ID does not exist"
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.981754 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb77w\" (UniqueName: \"kubernetes.io/projected/9859752d-fefb-4fc1-8af4-3daa54e9b0ed-kube-api-access-pb77w\") on node \"crc\" DevicePath \"\""
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.981795 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9859752d-fefb-4fc1-8af4-3daa54e9b0ed-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 29 11:08:56 crc kubenswrapper[4752]: I0929 11:08:56.981827 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9859752d-fefb-4fc1-8af4-3daa54e9b0ed-utilities\") on node \"crc\" DevicePath \"\""
Sep 29 11:08:57 crc kubenswrapper[4752]: I0929 11:08:57.168487 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5xz8h"]
Sep 29 11:08:57 crc kubenswrapper[4752]: I0929 11:08:57.175862 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5xz8h"]
Sep 29 11:08:58 crc kubenswrapper[4752]: I0929 11:08:58.045683 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9859752d-fefb-4fc1-8af4-3daa54e9b0ed" path="/var/lib/kubelet/pods/9859752d-fefb-4fc1-8af4-3daa54e9b0ed/volumes"
Sep 29 11:09:08 crc kubenswrapper[4752]: I0929 11:09:08.031160 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670"
Sep 29 11:09:08 crc kubenswrapper[4752]: E0929 11:09:08.031995 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1"
Sep 29 11:09:19 crc kubenswrapper[4752]: I0929 11:09:19.031778 4752 scope.go:117]
"RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:09:19 crc kubenswrapper[4752]: E0929 11:09:19.032603 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:09:32 crc kubenswrapper[4752]: I0929 11:09:32.031653 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:09:32 crc kubenswrapper[4752]: E0929 11:09:32.032497 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:09:46 crc kubenswrapper[4752]: I0929 11:09:46.031633 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:09:46 crc kubenswrapper[4752]: E0929 11:09:46.033349 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:09:57 crc kubenswrapper[4752]: I0929 11:09:57.032132 
4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:09:57 crc kubenswrapper[4752]: E0929 11:09:57.033604 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.703733 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-h665g"] Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.710360 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-h665g"] Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.763528 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher147d-account-delete-cncqk"] Sep 29 11:10:08 crc kubenswrapper[4752]: E0929 11:10:08.763877 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9859752d-fefb-4fc1-8af4-3daa54e9b0ed" containerName="extract-content" Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.763894 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9859752d-fefb-4fc1-8af4-3daa54e9b0ed" containerName="extract-content" Sep 29 11:10:08 crc kubenswrapper[4752]: E0929 11:10:08.763906 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9859752d-fefb-4fc1-8af4-3daa54e9b0ed" containerName="registry-server" Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.763915 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9859752d-fefb-4fc1-8af4-3daa54e9b0ed" containerName="registry-server" Sep 29 11:10:08 crc kubenswrapper[4752]: E0929 11:10:08.763935 4752 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9859752d-fefb-4fc1-8af4-3daa54e9b0ed" containerName="extract-utilities" Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.763941 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9859752d-fefb-4fc1-8af4-3daa54e9b0ed" containerName="extract-utilities" Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.764088 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="9859752d-fefb-4fc1-8af4-3daa54e9b0ed" containerName="registry-server" Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.764641 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher147d-account-delete-cncqk" Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.773502 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher147d-account-delete-cncqk"] Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.813871 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-plzcj"] Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.836025 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-plzcj"] Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.846814 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher147d-account-delete-cncqk"] Sep 29 11:10:08 crc kubenswrapper[4752]: E0929 11:10:08.847402 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-7kpqs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="watcher-kuttl-default/watcher147d-account-delete-cncqk" podUID="f0d94fff-5473-40e2-b386-bd8e69a834ef" Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.858741 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-147d-account-create-79sqq"] Sep 29 11:10:08 crc 
kubenswrapper[4752]: I0929 11:10:08.866212 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kpqs\" (UniqueName: \"kubernetes.io/projected/f0d94fff-5473-40e2-b386-bd8e69a834ef-kube-api-access-7kpqs\") pod \"watcher147d-account-delete-cncqk\" (UID: \"f0d94fff-5473-40e2-b386-bd8e69a834ef\") " pod="watcher-kuttl-default/watcher147d-account-delete-cncqk" Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.866382 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.866572 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="2c3e5b57-3df6-4005-aa53-7cc77c1101ad" containerName="watcher-decision-engine" containerID="cri-o://338689d2438b2a2babeb9971e3a14d311a846b7870e414d5b8897b75c7dc86d8" gracePeriod=30 Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.881669 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-147d-account-create-79sqq"] Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.902881 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.903130 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="1c7abae5-8a39-4e88-ac5c-997fb44a9fcc" containerName="watcher-applier" containerID="cri-o://8f129063a19f5270a664c01d8b862f373a2001f1ae0937949cf1cf4af5d08e13" gracePeriod=30 Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.914447 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.916280 4752 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c" containerName="watcher-kuttl-api-log" containerID="cri-o://34c7061bdfaa8c6f62a36fdf2e10a14fd0db032c6c3de657e5d6e57f95f942f1" gracePeriod=30 Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.916658 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c" containerName="watcher-api" containerID="cri-o://b15b69efa07590dd7ca87daeab60e9306e135919b4b681122ee3513e869ee432" gracePeriod=30 Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.967891 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kpqs\" (UniqueName: \"kubernetes.io/projected/f0d94fff-5473-40e2-b386-bd8e69a834ef-kube-api-access-7kpqs\") pod \"watcher147d-account-delete-cncqk\" (UID: \"f0d94fff-5473-40e2-b386-bd8e69a834ef\") " pod="watcher-kuttl-default/watcher147d-account-delete-cncqk" Sep 29 11:10:08 crc kubenswrapper[4752]: I0929 11:10:08.994816 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kpqs\" (UniqueName: \"kubernetes.io/projected/f0d94fff-5473-40e2-b386-bd8e69a834ef-kube-api-access-7kpqs\") pod \"watcher147d-account-delete-cncqk\" (UID: \"f0d94fff-5473-40e2-b386-bd8e69a834ef\") " pod="watcher-kuttl-default/watcher147d-account-delete-cncqk" Sep 29 11:10:09 crc kubenswrapper[4752]: I0929 11:10:09.245089 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-9xkb5"] Sep 29 11:10:09 crc kubenswrapper[4752]: I0929 11:10:09.246563 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-9xkb5" Sep 29 11:10:09 crc kubenswrapper[4752]: I0929 11:10:09.264594 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-9xkb5"] Sep 29 11:10:09 crc kubenswrapper[4752]: I0929 11:10:09.373552 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6x7r\" (UniqueName: \"kubernetes.io/projected/2c26afed-cf6c-4876-a817-216eae20e80f-kube-api-access-j6x7r\") pod \"watcher-db-create-9xkb5\" (UID: \"2c26afed-cf6c-4876-a817-216eae20e80f\") " pod="watcher-kuttl-default/watcher-db-create-9xkb5" Sep 29 11:10:09 crc kubenswrapper[4752]: I0929 11:10:09.440323 4752 generic.go:334] "Generic (PLEG): container finished" podID="c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c" containerID="34c7061bdfaa8c6f62a36fdf2e10a14fd0db032c6c3de657e5d6e57f95f942f1" exitCode=143 Sep 29 11:10:09 crc kubenswrapper[4752]: I0929 11:10:09.440395 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher147d-account-delete-cncqk" Sep 29 11:10:09 crc kubenswrapper[4752]: I0929 11:10:09.440384 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c","Type":"ContainerDied","Data":"34c7061bdfaa8c6f62a36fdf2e10a14fd0db032c6c3de657e5d6e57f95f942f1"} Sep 29 11:10:09 crc kubenswrapper[4752]: I0929 11:10:09.450454 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher147d-account-delete-cncqk" Sep 29 11:10:09 crc kubenswrapper[4752]: I0929 11:10:09.475674 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6x7r\" (UniqueName: \"kubernetes.io/projected/2c26afed-cf6c-4876-a817-216eae20e80f-kube-api-access-j6x7r\") pod \"watcher-db-create-9xkb5\" (UID: \"2c26afed-cf6c-4876-a817-216eae20e80f\") " pod="watcher-kuttl-default/watcher-db-create-9xkb5" Sep 29 11:10:09 crc kubenswrapper[4752]: I0929 11:10:09.501217 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6x7r\" (UniqueName: \"kubernetes.io/projected/2c26afed-cf6c-4876-a817-216eae20e80f-kube-api-access-j6x7r\") pod \"watcher-db-create-9xkb5\" (UID: \"2c26afed-cf6c-4876-a817-216eae20e80f\") " pod="watcher-kuttl-default/watcher-db-create-9xkb5" Sep 29 11:10:09 crc kubenswrapper[4752]: I0929 11:10:09.567689 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-9xkb5" Sep 29 11:10:09 crc kubenswrapper[4752]: I0929 11:10:09.577318 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kpqs\" (UniqueName: \"kubernetes.io/projected/f0d94fff-5473-40e2-b386-bd8e69a834ef-kube-api-access-7kpqs\") pod \"f0d94fff-5473-40e2-b386-bd8e69a834ef\" (UID: \"f0d94fff-5473-40e2-b386-bd8e69a834ef\") " Sep 29 11:10:09 crc kubenswrapper[4752]: I0929 11:10:09.580481 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0d94fff-5473-40e2-b386-bd8e69a834ef-kube-api-access-7kpqs" (OuterVolumeSpecName: "kube-api-access-7kpqs") pod "f0d94fff-5473-40e2-b386-bd8e69a834ef" (UID: "f0d94fff-5473-40e2-b386-bd8e69a834ef"). InnerVolumeSpecName "kube-api-access-7kpqs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:10:09 crc kubenswrapper[4752]: I0929 11:10:09.679772 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kpqs\" (UniqueName: \"kubernetes.io/projected/f0d94fff-5473-40e2-b386-bd8e69a834ef-kube-api-access-7kpqs\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.044432 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="739beb56-a180-4f55-b671-52a96d798d72" path="/var/lib/kubelet/pods/739beb56-a180-4f55-b671-52a96d798d72/volumes" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.045946 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3ca3474-aa46-4a60-8451-524755f59b62" path="/var/lib/kubelet/pods/a3ca3474-aa46-4a60-8451-524755f59b62/volumes" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.048929 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3cc2bbe-57f1-4c2f-802c-e7d954268185" path="/var/lib/kubelet/pods/b3cc2bbe-57f1-4c2f-802c-e7d954268185/volumes" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.102755 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-9xkb5"] Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.205010 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.292310 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-logs\") pod \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\" (UID: \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\") " Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.292380 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-config-data\") pod \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\" (UID: \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\") " Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.292434 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-custom-prometheus-ca\") pod \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\" (UID: \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\") " Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.292492 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-combined-ca-bundle\") pod \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\" (UID: \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\") " Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.292604 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zrl5\" (UniqueName: \"kubernetes.io/projected/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-kube-api-access-8zrl5\") pod \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\" (UID: \"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c\") " Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.294206 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-logs" (OuterVolumeSpecName: "logs") pod "c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c" (UID: "c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.318977 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-kube-api-access-8zrl5" (OuterVolumeSpecName: "kube-api-access-8zrl5") pod "c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c" (UID: "c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c"). InnerVolumeSpecName "kube-api-access-8zrl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.345685 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c" (UID: "c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.346544 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c" (UID: "c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.368235 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-config-data" (OuterVolumeSpecName: "config-data") pod "c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c" (UID: "c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.390864 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.394149 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zrl5\" (UniqueName: \"kubernetes.io/projected/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-kube-api-access-8zrl5\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.394190 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-logs\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.394202 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.394213 4752 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.394414 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.450634 4752 generic.go:334] "Generic (PLEG): container finished" podID="c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c" containerID="b15b69efa07590dd7ca87daeab60e9306e135919b4b681122ee3513e869ee432" exitCode=0 Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.450762 4752 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.450960 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c","Type":"ContainerDied","Data":"b15b69efa07590dd7ca87daeab60e9306e135919b4b681122ee3513e869ee432"} Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.451057 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c","Type":"ContainerDied","Data":"d4d3cc325241ca5109885ac08c038d41e30d8a2bab6356e8645b3052917b673c"} Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.451120 4752 scope.go:117] "RemoveContainer" containerID="b15b69efa07590dd7ca87daeab60e9306e135919b4b681122ee3513e869ee432" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.453829 4752 generic.go:334] "Generic (PLEG): container finished" podID="2c3e5b57-3df6-4005-aa53-7cc77c1101ad" containerID="338689d2438b2a2babeb9971e3a14d311a846b7870e414d5b8897b75c7dc86d8" exitCode=0 Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.453905 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"2c3e5b57-3df6-4005-aa53-7cc77c1101ad","Type":"ContainerDied","Data":"338689d2438b2a2babeb9971e3a14d311a846b7870e414d5b8897b75c7dc86d8"} Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.453930 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"2c3e5b57-3df6-4005-aa53-7cc77c1101ad","Type":"ContainerDied","Data":"e0e636169f38ec8a4740544b3da90e992ad2c24fc876db6832b759f270b35fc8"} Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.453907 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.457734 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher147d-account-delete-cncqk" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.457733 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-9xkb5" event={"ID":"2c26afed-cf6c-4876-a817-216eae20e80f","Type":"ContainerStarted","Data":"30e5b3313d62a38a3cc1d0ae790605dd5cc2e3cc17a37a875a5e931339bdd518"} Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.457780 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-9xkb5" event={"ID":"2c26afed-cf6c-4876-a817-216eae20e80f","Type":"ContainerStarted","Data":"7e89c276e31bad8c49fc88c3ec6335adb87b2415140612f836194933286e45b3"} Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.474580 4752 scope.go:117] "RemoveContainer" containerID="34c7061bdfaa8c6f62a36fdf2e10a14fd0db032c6c3de657e5d6e57f95f942f1" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.489563 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-db-create-9xkb5" podStartSLOduration=1.489543377 podStartE2EDuration="1.489543377s" podCreationTimestamp="2025-09-29 11:10:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:10:10.480832119 +0000 UTC m=+1551.269973786" watchObservedRunningTime="2025-09-29 11:10:10.489543377 +0000 UTC m=+1551.278685044" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.495191 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vflf6\" (UniqueName: \"kubernetes.io/projected/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-kube-api-access-vflf6\") pod 
\"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\" (UID: \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\") " Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.495250 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-combined-ca-bundle\") pod \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\" (UID: \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\") " Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.495359 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-logs\") pod \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\" (UID: \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\") " Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.495397 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-custom-prometheus-ca\") pod \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\" (UID: \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\") " Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.495475 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-config-data\") pod \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\" (UID: \"2c3e5b57-3df6-4005-aa53-7cc77c1101ad\") " Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.496984 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-logs" (OuterVolumeSpecName: "logs") pod "2c3e5b57-3df6-4005-aa53-7cc77c1101ad" (UID: "2c3e5b57-3df6-4005-aa53-7cc77c1101ad"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.503640 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-kube-api-access-vflf6" (OuterVolumeSpecName: "kube-api-access-vflf6") pod "2c3e5b57-3df6-4005-aa53-7cc77c1101ad" (UID: "2c3e5b57-3df6-4005-aa53-7cc77c1101ad"). InnerVolumeSpecName "kube-api-access-vflf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.509228 4752 scope.go:117] "RemoveContainer" containerID="b15b69efa07590dd7ca87daeab60e9306e135919b4b681122ee3513e869ee432" Sep 29 11:10:10 crc kubenswrapper[4752]: E0929 11:10:10.510048 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b15b69efa07590dd7ca87daeab60e9306e135919b4b681122ee3513e869ee432\": container with ID starting with b15b69efa07590dd7ca87daeab60e9306e135919b4b681122ee3513e869ee432 not found: ID does not exist" containerID="b15b69efa07590dd7ca87daeab60e9306e135919b4b681122ee3513e869ee432" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.510080 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b15b69efa07590dd7ca87daeab60e9306e135919b4b681122ee3513e869ee432"} err="failed to get container status \"b15b69efa07590dd7ca87daeab60e9306e135919b4b681122ee3513e869ee432\": rpc error: code = NotFound desc = could not find container \"b15b69efa07590dd7ca87daeab60e9306e135919b4b681122ee3513e869ee432\": container with ID starting with b15b69efa07590dd7ca87daeab60e9306e135919b4b681122ee3513e869ee432 not found: ID does not exist" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.510099 4752 scope.go:117] "RemoveContainer" containerID="34c7061bdfaa8c6f62a36fdf2e10a14fd0db032c6c3de657e5d6e57f95f942f1" Sep 29 11:10:10 crc kubenswrapper[4752]: E0929 11:10:10.510311 
4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34c7061bdfaa8c6f62a36fdf2e10a14fd0db032c6c3de657e5d6e57f95f942f1\": container with ID starting with 34c7061bdfaa8c6f62a36fdf2e10a14fd0db032c6c3de657e5d6e57f95f942f1 not found: ID does not exist" containerID="34c7061bdfaa8c6f62a36fdf2e10a14fd0db032c6c3de657e5d6e57f95f942f1" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.510361 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34c7061bdfaa8c6f62a36fdf2e10a14fd0db032c6c3de657e5d6e57f95f942f1"} err="failed to get container status \"34c7061bdfaa8c6f62a36fdf2e10a14fd0db032c6c3de657e5d6e57f95f942f1\": rpc error: code = NotFound desc = could not find container \"34c7061bdfaa8c6f62a36fdf2e10a14fd0db032c6c3de657e5d6e57f95f942f1\": container with ID starting with 34c7061bdfaa8c6f62a36fdf2e10a14fd0db032c6c3de657e5d6e57f95f942f1 not found: ID does not exist" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.510375 4752 scope.go:117] "RemoveContainer" containerID="338689d2438b2a2babeb9971e3a14d311a846b7870e414d5b8897b75c7dc86d8" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.514057 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.539223 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.539455 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c3e5b57-3df6-4005-aa53-7cc77c1101ad" (UID: "2c3e5b57-3df6-4005-aa53-7cc77c1101ad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.555868 4752 scope.go:117] "RemoveContainer" containerID="338689d2438b2a2babeb9971e3a14d311a846b7870e414d5b8897b75c7dc86d8" Sep 29 11:10:10 crc kubenswrapper[4752]: E0929 11:10:10.560518 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"338689d2438b2a2babeb9971e3a14d311a846b7870e414d5b8897b75c7dc86d8\": container with ID starting with 338689d2438b2a2babeb9971e3a14d311a846b7870e414d5b8897b75c7dc86d8 not found: ID does not exist" containerID="338689d2438b2a2babeb9971e3a14d311a846b7870e414d5b8897b75c7dc86d8" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.560712 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"338689d2438b2a2babeb9971e3a14d311a846b7870e414d5b8897b75c7dc86d8"} err="failed to get container status \"338689d2438b2a2babeb9971e3a14d311a846b7870e414d5b8897b75c7dc86d8\": rpc error: code = NotFound desc = could not find container \"338689d2438b2a2babeb9971e3a14d311a846b7870e414d5b8897b75c7dc86d8\": container with ID starting with 338689d2438b2a2babeb9971e3a14d311a846b7870e414d5b8897b75c7dc86d8 not found: ID does not exist" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.594038 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-config-data" (OuterVolumeSpecName: "config-data") pod "2c3e5b57-3df6-4005-aa53-7cc77c1101ad" (UID: "2c3e5b57-3df6-4005-aa53-7cc77c1101ad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.596724 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "2c3e5b57-3df6-4005-aa53-7cc77c1101ad" (UID: "2c3e5b57-3df6-4005-aa53-7cc77c1101ad"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.602530 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.604007 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vflf6\" (UniqueName: \"kubernetes.io/projected/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-kube-api-access-vflf6\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.604118 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.604217 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-logs\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.604281 4752 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2c3e5b57-3df6-4005-aa53-7cc77c1101ad-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.614872 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["watcher-kuttl-default/watcher147d-account-delete-cncqk"] Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.663029 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher147d-account-delete-cncqk"] Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.798062 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:10:10 crc kubenswrapper[4752]: I0929 11:10:10.813547 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:10:11 crc kubenswrapper[4752]: I0929 11:10:11.470593 4752 generic.go:334] "Generic (PLEG): container finished" podID="2c26afed-cf6c-4876-a817-216eae20e80f" containerID="30e5b3313d62a38a3cc1d0ae790605dd5cc2e3cc17a37a875a5e931339bdd518" exitCode=0 Sep 29 11:10:11 crc kubenswrapper[4752]: I0929 11:10:11.470676 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-9xkb5" event={"ID":"2c26afed-cf6c-4876-a817-216eae20e80f","Type":"ContainerDied","Data":"30e5b3313d62a38a3cc1d0ae790605dd5cc2e3cc17a37a875a5e931339bdd518"} Sep 29 11:10:11 crc kubenswrapper[4752]: I0929 11:10:11.619264 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:10:11 crc kubenswrapper[4752]: I0929 11:10:11.619824 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" containerName="ceilometer-central-agent" containerID="cri-o://be18ce7da4565251bc9b510cae875334df0124db25fe73cbb79e356458a6f1b3" gracePeriod=30 Sep 29 11:10:11 crc kubenswrapper[4752]: I0929 11:10:11.619900 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" containerName="proxy-httpd" 
containerID="cri-o://cd7c78b58f990780c3e6ac29d379c9b8f7a84c8e5c6511f5f02d022a2ff78128" gracePeriod=30 Sep 29 11:10:11 crc kubenswrapper[4752]: I0929 11:10:11.619919 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" containerName="sg-core" containerID="cri-o://724a1a14fc70744885e26a376743ca369ceb4826172cd3a5651fa92076a68fd9" gracePeriod=30 Sep 29 11:10:11 crc kubenswrapper[4752]: I0929 11:10:11.619941 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" containerName="ceilometer-notification-agent" containerID="cri-o://0b1773cffe4f89c66cb8c7d039ce2f198253a606f840e6a7d055078ef0e72e02" gracePeriod=30 Sep 29 11:10:12 crc kubenswrapper[4752]: I0929 11:10:12.031049 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:10:12 crc kubenswrapper[4752]: E0929 11:10:12.031326 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:10:12 crc kubenswrapper[4752]: I0929 11:10:12.043995 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c3e5b57-3df6-4005-aa53-7cc77c1101ad" path="/var/lib/kubelet/pods/2c3e5b57-3df6-4005-aa53-7cc77c1101ad/volumes" Sep 29 11:10:12 crc kubenswrapper[4752]: I0929 11:10:12.044655 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c" path="/var/lib/kubelet/pods/c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c/volumes" 
Sep 29 11:10:12 crc kubenswrapper[4752]: I0929 11:10:12.045327 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0d94fff-5473-40e2-b386-bd8e69a834ef" path="/var/lib/kubelet/pods/f0d94fff-5473-40e2-b386-bd8e69a834ef/volumes" Sep 29 11:10:12 crc kubenswrapper[4752]: I0929 11:10:12.486928 4752 generic.go:334] "Generic (PLEG): container finished" podID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" containerID="cd7c78b58f990780c3e6ac29d379c9b8f7a84c8e5c6511f5f02d022a2ff78128" exitCode=0 Sep 29 11:10:12 crc kubenswrapper[4752]: I0929 11:10:12.487214 4752 generic.go:334] "Generic (PLEG): container finished" podID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" containerID="724a1a14fc70744885e26a376743ca369ceb4826172cd3a5651fa92076a68fd9" exitCode=2 Sep 29 11:10:12 crc kubenswrapper[4752]: I0929 11:10:12.487222 4752 generic.go:334] "Generic (PLEG): container finished" podID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" containerID="be18ce7da4565251bc9b510cae875334df0124db25fe73cbb79e356458a6f1b3" exitCode=0 Sep 29 11:10:12 crc kubenswrapper[4752]: I0929 11:10:12.487012 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c","Type":"ContainerDied","Data":"cd7c78b58f990780c3e6ac29d379c9b8f7a84c8e5c6511f5f02d022a2ff78128"} Sep 29 11:10:12 crc kubenswrapper[4752]: I0929 11:10:12.487377 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c","Type":"ContainerDied","Data":"724a1a14fc70744885e26a376743ca369ceb4826172cd3a5651fa92076a68fd9"} Sep 29 11:10:12 crc kubenswrapper[4752]: I0929 11:10:12.487391 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c","Type":"ContainerDied","Data":"be18ce7da4565251bc9b510cae875334df0124db25fe73cbb79e356458a6f1b3"} Sep 29 11:10:12 crc kubenswrapper[4752]: I0929 
11:10:12.790421 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-9xkb5" Sep 29 11:10:12 crc kubenswrapper[4752]: I0929 11:10:12.938468 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6x7r\" (UniqueName: \"kubernetes.io/projected/2c26afed-cf6c-4876-a817-216eae20e80f-kube-api-access-j6x7r\") pod \"2c26afed-cf6c-4876-a817-216eae20e80f\" (UID: \"2c26afed-cf6c-4876-a817-216eae20e80f\") " Sep 29 11:10:12 crc kubenswrapper[4752]: I0929 11:10:12.943913 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c26afed-cf6c-4876-a817-216eae20e80f-kube-api-access-j6x7r" (OuterVolumeSpecName: "kube-api-access-j6x7r") pod "2c26afed-cf6c-4876-a817-216eae20e80f" (UID: "2c26afed-cf6c-4876-a817-216eae20e80f"). InnerVolumeSpecName "kube-api-access-j6x7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:10:13 crc kubenswrapper[4752]: I0929 11:10:13.039997 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6x7r\" (UniqueName: \"kubernetes.io/projected/2c26afed-cf6c-4876-a817-216eae20e80f-kube-api-access-j6x7r\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:13 crc kubenswrapper[4752]: I0929 11:10:13.496966 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-9xkb5" event={"ID":"2c26afed-cf6c-4876-a817-216eae20e80f","Type":"ContainerDied","Data":"7e89c276e31bad8c49fc88c3ec6335adb87b2415140612f836194933286e45b3"} Sep 29 11:10:13 crc kubenswrapper[4752]: I0929 11:10:13.497217 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e89c276e31bad8c49fc88c3ec6335adb87b2415140612f836194933286e45b3" Sep 29 11:10:13 crc kubenswrapper[4752]: I0929 11:10:13.497032 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-9xkb5" Sep 29 11:10:13 crc kubenswrapper[4752]: E0929 11:10:13.859450 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8f129063a19f5270a664c01d8b862f373a2001f1ae0937949cf1cf4af5d08e13" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:10:13 crc kubenswrapper[4752]: E0929 11:10:13.868292 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8f129063a19f5270a664c01d8b862f373a2001f1ae0937949cf1cf4af5d08e13" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:10:13 crc kubenswrapper[4752]: E0929 11:10:13.878190 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8f129063a19f5270a664c01d8b862f373a2001f1ae0937949cf1cf4af5d08e13" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:10:13 crc kubenswrapper[4752]: E0929 11:10:13.878276 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="1c7abae5-8a39-4e88-ac5c-997fb44a9fcc" containerName="watcher-applier" Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.507979 4752 generic.go:334] "Generic (PLEG): container finished" podID="1c7abae5-8a39-4e88-ac5c-997fb44a9fcc" containerID="8f129063a19f5270a664c01d8b862f373a2001f1ae0937949cf1cf4af5d08e13" exitCode=0 Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.508069 4752 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc","Type":"ContainerDied","Data":"8f129063a19f5270a664c01d8b862f373a2001f1ae0937949cf1cf4af5d08e13"} Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.754629 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.873202 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cgdh\" (UniqueName: \"kubernetes.io/projected/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-kube-api-access-4cgdh\") pod \"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc\" (UID: \"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc\") " Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.873313 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-logs\") pod \"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc\" (UID: \"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc\") " Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.873590 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-combined-ca-bundle\") pod \"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc\" (UID: \"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc\") " Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.873723 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-config-data\") pod \"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc\" (UID: \"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc\") " Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.874538 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-logs" (OuterVolumeSpecName: "logs") pod "1c7abae5-8a39-4e88-ac5c-997fb44a9fcc" (UID: "1c7abae5-8a39-4e88-ac5c-997fb44a9fcc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.880439 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-kube-api-access-4cgdh" (OuterVolumeSpecName: "kube-api-access-4cgdh") pod "1c7abae5-8a39-4e88-ac5c-997fb44a9fcc" (UID: "1c7abae5-8a39-4e88-ac5c-997fb44a9fcc"). InnerVolumeSpecName "kube-api-access-4cgdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.901742 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c7abae5-8a39-4e88-ac5c-997fb44a9fcc" (UID: "1c7abae5-8a39-4e88-ac5c-997fb44a9fcc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.904787 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.920779 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-config-data" (OuterVolumeSpecName: "config-data") pod "1c7abae5-8a39-4e88-ac5c-997fb44a9fcc" (UID: "1c7abae5-8a39-4e88-ac5c-997fb44a9fcc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.975677 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-run-httpd\") pod \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.976036 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-log-httpd\") pod \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.976071 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-sg-core-conf-yaml\") pod \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.976109 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw5vm\" (UniqueName: \"kubernetes.io/projected/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-kube-api-access-dw5vm\") pod \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.976160 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-combined-ca-bundle\") pod \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.976199 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-config-data\") pod \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.976213 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-scripts\") pod \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.976250 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-ceilometer-tls-certs\") pod \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\" (UID: \"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c\") " Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.976596 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cgdh\" (UniqueName: \"kubernetes.io/projected/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-kube-api-access-4cgdh\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.976614 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-logs\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.976624 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.976634 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.980437 4752 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-kube-api-access-dw5vm" (OuterVolumeSpecName: "kube-api-access-dw5vm") pod "bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" (UID: "bd1e5555-3ef3-4c8c-a964-f9b64eb3064c"). InnerVolumeSpecName "kube-api-access-dw5vm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.980459 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" (UID: "bd1e5555-3ef3-4c8c-a964-f9b64eb3064c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.981196 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" (UID: "bd1e5555-3ef3-4c8c-a964-f9b64eb3064c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:10:14 crc kubenswrapper[4752]: I0929 11:10:14.984934 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-scripts" (OuterVolumeSpecName: "scripts") pod "bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" (UID: "bd1e5555-3ef3-4c8c-a964-f9b64eb3064c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.004296 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" (UID: "bd1e5555-3ef3-4c8c-a964-f9b64eb3064c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.031504 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" (UID: "bd1e5555-3ef3-4c8c-a964-f9b64eb3064c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.041990 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" (UID: "bd1e5555-3ef3-4c8c-a964-f9b64eb3064c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.072546 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-config-data" (OuterVolumeSpecName: "config-data") pod "bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" (UID: "bd1e5555-3ef3-4c8c-a964-f9b64eb3064c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.078218 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.078264 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.078277 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.078289 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw5vm\" (UniqueName: \"kubernetes.io/projected/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-kube-api-access-dw5vm\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.078300 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.078313 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.078323 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.078334 4752 reconciler_common.go:293] 
"Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.516994 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"1c7abae5-8a39-4e88-ac5c-997fb44a9fcc","Type":"ContainerDied","Data":"89150f4df4eeacd20a1b308954aa66c55c99f862151749c9c5ae791f08087753"} Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.517052 4752 scope.go:117] "RemoveContainer" containerID="8f129063a19f5270a664c01d8b862f373a2001f1ae0937949cf1cf4af5d08e13" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.517566 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.521095 4752 generic.go:334] "Generic (PLEG): container finished" podID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" containerID="0b1773cffe4f89c66cb8c7d039ce2f198253a606f840e6a7d055078ef0e72e02" exitCode=0 Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.521149 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c","Type":"ContainerDied","Data":"0b1773cffe4f89c66cb8c7d039ce2f198253a606f840e6a7d055078ef0e72e02"} Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.521166 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.521182 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"bd1e5555-3ef3-4c8c-a964-f9b64eb3064c","Type":"ContainerDied","Data":"5e0dae4ff71d14ad4590734a7e85850000356c98c598de3fc41615a06486cfbf"} Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.548338 4752 scope.go:117] "RemoveContainer" containerID="cd7c78b58f990780c3e6ac29d379c9b8f7a84c8e5c6511f5f02d022a2ff78128" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.564892 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.577998 4752 scope.go:117] "RemoveContainer" containerID="724a1a14fc70744885e26a376743ca369ceb4826172cd3a5651fa92076a68fd9" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.580504 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.590246 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.603834 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:10:15 crc kubenswrapper[4752]: E0929 11:10:15.604320 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" containerName="ceilometer-notification-agent" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.604338 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" containerName="ceilometer-notification-agent" Sep 29 11:10:15 crc kubenswrapper[4752]: E0929 11:10:15.604367 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" 
containerName="proxy-httpd" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.604374 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" containerName="proxy-httpd" Sep 29 11:10:15 crc kubenswrapper[4752]: E0929 11:10:15.604406 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c" containerName="watcher-api" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.604413 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c" containerName="watcher-api" Sep 29 11:10:15 crc kubenswrapper[4752]: E0929 11:10:15.604434 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" containerName="ceilometer-central-agent" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.604440 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" containerName="ceilometer-central-agent" Sep 29 11:10:15 crc kubenswrapper[4752]: E0929 11:10:15.604470 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3e5b57-3df6-4005-aa53-7cc77c1101ad" containerName="watcher-decision-engine" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.604477 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3e5b57-3df6-4005-aa53-7cc77c1101ad" containerName="watcher-decision-engine" Sep 29 11:10:15 crc kubenswrapper[4752]: E0929 11:10:15.604488 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c7abae5-8a39-4e88-ac5c-997fb44a9fcc" containerName="watcher-applier" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.604493 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7abae5-8a39-4e88-ac5c-997fb44a9fcc" containerName="watcher-applier" Sep 29 11:10:15 crc kubenswrapper[4752]: E0929 11:10:15.604503 4752 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2c26afed-cf6c-4876-a817-216eae20e80f" containerName="mariadb-database-create" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.604508 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c26afed-cf6c-4876-a817-216eae20e80f" containerName="mariadb-database-create" Sep 29 11:10:15 crc kubenswrapper[4752]: E0929 11:10:15.604518 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c" containerName="watcher-kuttl-api-log" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.604524 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c" containerName="watcher-kuttl-api-log" Sep 29 11:10:15 crc kubenswrapper[4752]: E0929 11:10:15.604536 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" containerName="sg-core" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.604556 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" containerName="sg-core" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.604865 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c26afed-cf6c-4876-a817-216eae20e80f" containerName="mariadb-database-create" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.604879 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c" containerName="watcher-api" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.604896 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c7abae5-8a39-4e88-ac5c-997fb44a9fcc" containerName="watcher-applier" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.604906 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" containerName="ceilometer-central-agent" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.604914 4752 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" containerName="sg-core" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.604939 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" containerName="ceilometer-notification-agent" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.604947 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a1fefe-b6e1-4a89-b8ee-4cb201d32c2c" containerName="watcher-kuttl-api-log" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.604970 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" containerName="proxy-httpd" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.604979 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c3e5b57-3df6-4005-aa53-7cc77c1101ad" containerName="watcher-decision-engine" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.607115 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.608549 4752 scope.go:117] "RemoveContainer" containerID="0b1773cffe4f89c66cb8c7d039ce2f198253a606f840e6a7d055078ef0e72e02" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.610699 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.610735 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.610853 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.619027 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.628251 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.636288 4752 scope.go:117] "RemoveContainer" containerID="be18ce7da4565251bc9b510cae875334df0124db25fe73cbb79e356458a6f1b3" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.655270 4752 scope.go:117] "RemoveContainer" containerID="cd7c78b58f990780c3e6ac29d379c9b8f7a84c8e5c6511f5f02d022a2ff78128" Sep 29 11:10:15 crc kubenswrapper[4752]: E0929 11:10:15.655900 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd7c78b58f990780c3e6ac29d379c9b8f7a84c8e5c6511f5f02d022a2ff78128\": container with ID starting with cd7c78b58f990780c3e6ac29d379c9b8f7a84c8e5c6511f5f02d022a2ff78128 not found: ID does not exist" containerID="cd7c78b58f990780c3e6ac29d379c9b8f7a84c8e5c6511f5f02d022a2ff78128" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 
11:10:15.655932 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd7c78b58f990780c3e6ac29d379c9b8f7a84c8e5c6511f5f02d022a2ff78128"} err="failed to get container status \"cd7c78b58f990780c3e6ac29d379c9b8f7a84c8e5c6511f5f02d022a2ff78128\": rpc error: code = NotFound desc = could not find container \"cd7c78b58f990780c3e6ac29d379c9b8f7a84c8e5c6511f5f02d022a2ff78128\": container with ID starting with cd7c78b58f990780c3e6ac29d379c9b8f7a84c8e5c6511f5f02d022a2ff78128 not found: ID does not exist" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.655958 4752 scope.go:117] "RemoveContainer" containerID="724a1a14fc70744885e26a376743ca369ceb4826172cd3a5651fa92076a68fd9" Sep 29 11:10:15 crc kubenswrapper[4752]: E0929 11:10:15.656216 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"724a1a14fc70744885e26a376743ca369ceb4826172cd3a5651fa92076a68fd9\": container with ID starting with 724a1a14fc70744885e26a376743ca369ceb4826172cd3a5651fa92076a68fd9 not found: ID does not exist" containerID="724a1a14fc70744885e26a376743ca369ceb4826172cd3a5651fa92076a68fd9" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.656317 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"724a1a14fc70744885e26a376743ca369ceb4826172cd3a5651fa92076a68fd9"} err="failed to get container status \"724a1a14fc70744885e26a376743ca369ceb4826172cd3a5651fa92076a68fd9\": rpc error: code = NotFound desc = could not find container \"724a1a14fc70744885e26a376743ca369ceb4826172cd3a5651fa92076a68fd9\": container with ID starting with 724a1a14fc70744885e26a376743ca369ceb4826172cd3a5651fa92076a68fd9 not found: ID does not exist" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.656403 4752 scope.go:117] "RemoveContainer" containerID="0b1773cffe4f89c66cb8c7d039ce2f198253a606f840e6a7d055078ef0e72e02" Sep 29 11:10:15 crc 
kubenswrapper[4752]: E0929 11:10:15.656734 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b1773cffe4f89c66cb8c7d039ce2f198253a606f840e6a7d055078ef0e72e02\": container with ID starting with 0b1773cffe4f89c66cb8c7d039ce2f198253a606f840e6a7d055078ef0e72e02 not found: ID does not exist" containerID="0b1773cffe4f89c66cb8c7d039ce2f198253a606f840e6a7d055078ef0e72e02" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.656760 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b1773cffe4f89c66cb8c7d039ce2f198253a606f840e6a7d055078ef0e72e02"} err="failed to get container status \"0b1773cffe4f89c66cb8c7d039ce2f198253a606f840e6a7d055078ef0e72e02\": rpc error: code = NotFound desc = could not find container \"0b1773cffe4f89c66cb8c7d039ce2f198253a606f840e6a7d055078ef0e72e02\": container with ID starting with 0b1773cffe4f89c66cb8c7d039ce2f198253a606f840e6a7d055078ef0e72e02 not found: ID does not exist" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.656780 4752 scope.go:117] "RemoveContainer" containerID="be18ce7da4565251bc9b510cae875334df0124db25fe73cbb79e356458a6f1b3" Sep 29 11:10:15 crc kubenswrapper[4752]: E0929 11:10:15.657127 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be18ce7da4565251bc9b510cae875334df0124db25fe73cbb79e356458a6f1b3\": container with ID starting with be18ce7da4565251bc9b510cae875334df0124db25fe73cbb79e356458a6f1b3 not found: ID does not exist" containerID="be18ce7da4565251bc9b510cae875334df0124db25fe73cbb79e356458a6f1b3" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.657238 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be18ce7da4565251bc9b510cae875334df0124db25fe73cbb79e356458a6f1b3"} err="failed to get container status 
\"be18ce7da4565251bc9b510cae875334df0124db25fe73cbb79e356458a6f1b3\": rpc error: code = NotFound desc = could not find container \"be18ce7da4565251bc9b510cae875334df0124db25fe73cbb79e356458a6f1b3\": container with ID starting with be18ce7da4565251bc9b510cae875334df0124db25fe73cbb79e356458a6f1b3 not found: ID does not exist" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.688621 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l52r2\" (UniqueName: \"kubernetes.io/projected/58504de4-1881-41d4-8b99-630c0e8cce8a-kube-api-access-l52r2\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.688711 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58504de4-1881-41d4-8b99-630c0e8cce8a-log-httpd\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.688747 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-scripts\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.688777 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-config-data\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.688841 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58504de4-1881-41d4-8b99-630c0e8cce8a-run-httpd\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.688882 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.688902 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.688919 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: E0929 11:10:15.713993 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd1e5555_3ef3_4c8c_a964_f9b64eb3064c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c7abae5_8a39_4e88_ac5c_997fb44a9fcc.slice/crio-89150f4df4eeacd20a1b308954aa66c55c99f862151749c9c5ae791f08087753\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c7abae5_8a39_4e88_ac5c_997fb44a9fcc.slice\": RecentStats: unable to find data in memory cache]" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.790747 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-scripts\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.790851 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-config-data\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.790907 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58504de4-1881-41d4-8b99-630c0e8cce8a-run-httpd\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.790971 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.791010 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 
crc kubenswrapper[4752]: I0929 11:10:15.791036 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.791082 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l52r2\" (UniqueName: \"kubernetes.io/projected/58504de4-1881-41d4-8b99-630c0e8cce8a-kube-api-access-l52r2\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.791158 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58504de4-1881-41d4-8b99-630c0e8cce8a-log-httpd\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.791615 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58504de4-1881-41d4-8b99-630c0e8cce8a-log-httpd\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.792370 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58504de4-1881-41d4-8b99-630c0e8cce8a-run-httpd\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.795974 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-scripts\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.795976 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.795970 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.796520 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-config-data\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.797409 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.813175 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l52r2\" (UniqueName: \"kubernetes.io/projected/58504de4-1881-41d4-8b99-630c0e8cce8a-kube-api-access-l52r2\") pod \"ceilometer-0\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " 
pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:15 crc kubenswrapper[4752]: I0929 11:10:15.935972 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:16 crc kubenswrapper[4752]: I0929 11:10:16.061521 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c7abae5-8a39-4e88-ac5c-997fb44a9fcc" path="/var/lib/kubelet/pods/1c7abae5-8a39-4e88-ac5c-997fb44a9fcc/volumes" Sep 29 11:10:16 crc kubenswrapper[4752]: I0929 11:10:16.062087 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd1e5555-3ef3-4c8c-a964-f9b64eb3064c" path="/var/lib/kubelet/pods/bd1e5555-3ef3-4c8c-a964-f9b64eb3064c/volumes" Sep 29 11:10:16 crc kubenswrapper[4752]: I0929 11:10:16.407563 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:10:16 crc kubenswrapper[4752]: I0929 11:10:16.529104 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"58504de4-1881-41d4-8b99-630c0e8cce8a","Type":"ContainerStarted","Data":"24f29ff5061d8ec9ac918087887645ae124c2b259a6cade44ba550fefb8b41a0"} Sep 29 11:10:17 crc kubenswrapper[4752]: I0929 11:10:17.544372 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"58504de4-1881-41d4-8b99-630c0e8cce8a","Type":"ContainerStarted","Data":"74c9db24212877ed8a19e02590367c51eb30537fcb48d1bd3a4f84b3ee364e0a"} Sep 29 11:10:18 crc kubenswrapper[4752]: I0929 11:10:18.554924 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"58504de4-1881-41d4-8b99-630c0e8cce8a","Type":"ContainerStarted","Data":"4393caf974990eb735039281298a506791486ea24bd9966039f8cbeb151a8e7e"} Sep 29 11:10:19 crc kubenswrapper[4752]: I0929 11:10:19.564515 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"58504de4-1881-41d4-8b99-630c0e8cce8a","Type":"ContainerStarted","Data":"5ff5485ae83a07ddf8e9978b23ae2f4fcca8c865952210451b9eb567a565b012"} Sep 29 11:10:20 crc kubenswrapper[4752]: I0929 11:10:20.757046 4752 scope.go:117] "RemoveContainer" containerID="e01016c76542e51402483036d4b612e7c0e136acbb18a2b4d89914506235824b" Sep 29 11:10:21 crc kubenswrapper[4752]: I0929 11:10:21.579837 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"58504de4-1881-41d4-8b99-630c0e8cce8a","Type":"ContainerStarted","Data":"6de5ab66ff6da4b6fd574ef251dbad6c1d30077f6f0b67c40d464cc56af5cede"} Sep 29 11:10:21 crc kubenswrapper[4752]: I0929 11:10:21.581867 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:21 crc kubenswrapper[4752]: I0929 11:10:21.609245 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.9198253469999997 podStartE2EDuration="6.609228127s" podCreationTimestamp="2025-09-29 11:10:15 +0000 UTC" firstStartedPulling="2025-09-29 11:10:16.416613035 +0000 UTC m=+1557.205754712" lastFinishedPulling="2025-09-29 11:10:20.106015825 +0000 UTC m=+1560.895157492" observedRunningTime="2025-09-29 11:10:21.601105663 +0000 UTC m=+1562.390247330" watchObservedRunningTime="2025-09-29 11:10:21.609228127 +0000 UTC m=+1562.398369794" Sep 29 11:10:23 crc kubenswrapper[4752]: I0929 11:10:23.906104 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-3aef-account-create-c6ng4"] Sep 29 11:10:23 crc kubenswrapper[4752]: I0929 11:10:23.907852 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-3aef-account-create-c6ng4" Sep 29 11:10:23 crc kubenswrapper[4752]: I0929 11:10:23.916405 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Sep 29 11:10:23 crc kubenswrapper[4752]: I0929 11:10:23.921106 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-3aef-account-create-c6ng4"] Sep 29 11:10:24 crc kubenswrapper[4752]: I0929 11:10:24.018723 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvjj7\" (UniqueName: \"kubernetes.io/projected/88232ec0-75f8-407a-90eb-2a8d6fbb7703-kube-api-access-gvjj7\") pod \"watcher-3aef-account-create-c6ng4\" (UID: \"88232ec0-75f8-407a-90eb-2a8d6fbb7703\") " pod="watcher-kuttl-default/watcher-3aef-account-create-c6ng4" Sep 29 11:10:24 crc kubenswrapper[4752]: I0929 11:10:24.119844 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvjj7\" (UniqueName: \"kubernetes.io/projected/88232ec0-75f8-407a-90eb-2a8d6fbb7703-kube-api-access-gvjj7\") pod \"watcher-3aef-account-create-c6ng4\" (UID: \"88232ec0-75f8-407a-90eb-2a8d6fbb7703\") " pod="watcher-kuttl-default/watcher-3aef-account-create-c6ng4" Sep 29 11:10:24 crc kubenswrapper[4752]: I0929 11:10:24.164127 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvjj7\" (UniqueName: \"kubernetes.io/projected/88232ec0-75f8-407a-90eb-2a8d6fbb7703-kube-api-access-gvjj7\") pod \"watcher-3aef-account-create-c6ng4\" (UID: \"88232ec0-75f8-407a-90eb-2a8d6fbb7703\") " pod="watcher-kuttl-default/watcher-3aef-account-create-c6ng4" Sep 29 11:10:24 crc kubenswrapper[4752]: I0929 11:10:24.227212 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-3aef-account-create-c6ng4" Sep 29 11:10:24 crc kubenswrapper[4752]: I0929 11:10:24.656124 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-3aef-account-create-c6ng4"] Sep 29 11:10:24 crc kubenswrapper[4752]: W0929 11:10:24.674219 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88232ec0_75f8_407a_90eb_2a8d6fbb7703.slice/crio-dc59d600b56bab39b9c1ea34535680b5e2d5b11e78557e38759cfee07ff4380a WatchSource:0}: Error finding container dc59d600b56bab39b9c1ea34535680b5e2d5b11e78557e38759cfee07ff4380a: Status 404 returned error can't find the container with id dc59d600b56bab39b9c1ea34535680b5e2d5b11e78557e38759cfee07ff4380a Sep 29 11:10:25 crc kubenswrapper[4752]: I0929 11:10:25.616456 4752 generic.go:334] "Generic (PLEG): container finished" podID="88232ec0-75f8-407a-90eb-2a8d6fbb7703" containerID="cb8a557c5b5a0ca2f113f14da7e6e690901826eefcdf792986324eb9d242d188" exitCode=0 Sep 29 11:10:25 crc kubenswrapper[4752]: I0929 11:10:25.616682 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3aef-account-create-c6ng4" event={"ID":"88232ec0-75f8-407a-90eb-2a8d6fbb7703","Type":"ContainerDied","Data":"cb8a557c5b5a0ca2f113f14da7e6e690901826eefcdf792986324eb9d242d188"} Sep 29 11:10:25 crc kubenswrapper[4752]: I0929 11:10:25.616792 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3aef-account-create-c6ng4" event={"ID":"88232ec0-75f8-407a-90eb-2a8d6fbb7703","Type":"ContainerStarted","Data":"dc59d600b56bab39b9c1ea34535680b5e2d5b11e78557e38759cfee07ff4380a"} Sep 29 11:10:26 crc kubenswrapper[4752]: I0929 11:10:26.909089 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-3aef-account-create-c6ng4" Sep 29 11:10:26 crc kubenswrapper[4752]: I0929 11:10:26.983424 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvjj7\" (UniqueName: \"kubernetes.io/projected/88232ec0-75f8-407a-90eb-2a8d6fbb7703-kube-api-access-gvjj7\") pod \"88232ec0-75f8-407a-90eb-2a8d6fbb7703\" (UID: \"88232ec0-75f8-407a-90eb-2a8d6fbb7703\") " Sep 29 11:10:27 crc kubenswrapper[4752]: I0929 11:10:27.002842 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88232ec0-75f8-407a-90eb-2a8d6fbb7703-kube-api-access-gvjj7" (OuterVolumeSpecName: "kube-api-access-gvjj7") pod "88232ec0-75f8-407a-90eb-2a8d6fbb7703" (UID: "88232ec0-75f8-407a-90eb-2a8d6fbb7703"). InnerVolumeSpecName "kube-api-access-gvjj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:10:27 crc kubenswrapper[4752]: I0929 11:10:27.030951 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:10:27 crc kubenswrapper[4752]: E0929 11:10:27.031205 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:10:27 crc kubenswrapper[4752]: I0929 11:10:27.087060 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvjj7\" (UniqueName: \"kubernetes.io/projected/88232ec0-75f8-407a-90eb-2a8d6fbb7703-kube-api-access-gvjj7\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:27 crc kubenswrapper[4752]: I0929 11:10:27.637636 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/watcher-3aef-account-create-c6ng4" event={"ID":"88232ec0-75f8-407a-90eb-2a8d6fbb7703","Type":"ContainerDied","Data":"dc59d600b56bab39b9c1ea34535680b5e2d5b11e78557e38759cfee07ff4380a"} Sep 29 11:10:27 crc kubenswrapper[4752]: I0929 11:10:27.637677 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-3aef-account-create-c6ng4" Sep 29 11:10:27 crc kubenswrapper[4752]: I0929 11:10:27.637681 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc59d600b56bab39b9c1ea34535680b5e2d5b11e78557e38759cfee07ff4380a" Sep 29 11:10:29 crc kubenswrapper[4752]: I0929 11:10:29.216200 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr"] Sep 29 11:10:29 crc kubenswrapper[4752]: E0929 11:10:29.216731 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88232ec0-75f8-407a-90eb-2a8d6fbb7703" containerName="mariadb-account-create" Sep 29 11:10:29 crc kubenswrapper[4752]: I0929 11:10:29.216744 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="88232ec0-75f8-407a-90eb-2a8d6fbb7703" containerName="mariadb-account-create" Sep 29 11:10:29 crc kubenswrapper[4752]: I0929 11:10:29.216894 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="88232ec0-75f8-407a-90eb-2a8d6fbb7703" containerName="mariadb-account-create" Sep 29 11:10:29 crc kubenswrapper[4752]: I0929 11:10:29.217412 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" Sep 29 11:10:29 crc kubenswrapper[4752]: I0929 11:10:29.219413 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-cv69s" Sep 29 11:10:29 crc kubenswrapper[4752]: I0929 11:10:29.221503 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Sep 29 11:10:29 crc kubenswrapper[4752]: I0929 11:10:29.224639 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr"] Sep 29 11:10:29 crc kubenswrapper[4752]: I0929 11:10:29.318695 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e50680-30b0-43b2-a586-b9bb5101ed94-config-data\") pod \"watcher-kuttl-db-sync-qh5gr\" (UID: \"10e50680-30b0-43b2-a586-b9bb5101ed94\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" Sep 29 11:10:29 crc kubenswrapper[4752]: I0929 11:10:29.318744 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzt2k\" (UniqueName: \"kubernetes.io/projected/10e50680-30b0-43b2-a586-b9bb5101ed94-kube-api-access-xzt2k\") pod \"watcher-kuttl-db-sync-qh5gr\" (UID: \"10e50680-30b0-43b2-a586-b9bb5101ed94\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" Sep 29 11:10:29 crc kubenswrapper[4752]: I0929 11:10:29.318792 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/10e50680-30b0-43b2-a586-b9bb5101ed94-db-sync-config-data\") pod \"watcher-kuttl-db-sync-qh5gr\" (UID: \"10e50680-30b0-43b2-a586-b9bb5101ed94\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" Sep 29 11:10:29 crc kubenswrapper[4752]: I0929 11:10:29.319074 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e50680-30b0-43b2-a586-b9bb5101ed94-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-qh5gr\" (UID: \"10e50680-30b0-43b2-a586-b9bb5101ed94\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" Sep 29 11:10:29 crc kubenswrapper[4752]: I0929 11:10:29.420547 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/10e50680-30b0-43b2-a586-b9bb5101ed94-db-sync-config-data\") pod \"watcher-kuttl-db-sync-qh5gr\" (UID: \"10e50680-30b0-43b2-a586-b9bb5101ed94\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" Sep 29 11:10:29 crc kubenswrapper[4752]: I0929 11:10:29.420658 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e50680-30b0-43b2-a586-b9bb5101ed94-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-qh5gr\" (UID: \"10e50680-30b0-43b2-a586-b9bb5101ed94\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" Sep 29 11:10:29 crc kubenswrapper[4752]: I0929 11:10:29.420771 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e50680-30b0-43b2-a586-b9bb5101ed94-config-data\") pod \"watcher-kuttl-db-sync-qh5gr\" (UID: \"10e50680-30b0-43b2-a586-b9bb5101ed94\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" Sep 29 11:10:29 crc kubenswrapper[4752]: I0929 11:10:29.420830 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzt2k\" (UniqueName: \"kubernetes.io/projected/10e50680-30b0-43b2-a586-b9bb5101ed94-kube-api-access-xzt2k\") pod \"watcher-kuttl-db-sync-qh5gr\" (UID: \"10e50680-30b0-43b2-a586-b9bb5101ed94\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" Sep 29 11:10:29 crc kubenswrapper[4752]: I0929 
11:10:29.425141 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/10e50680-30b0-43b2-a586-b9bb5101ed94-db-sync-config-data\") pod \"watcher-kuttl-db-sync-qh5gr\" (UID: \"10e50680-30b0-43b2-a586-b9bb5101ed94\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" Sep 29 11:10:29 crc kubenswrapper[4752]: I0929 11:10:29.425318 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e50680-30b0-43b2-a586-b9bb5101ed94-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-qh5gr\" (UID: \"10e50680-30b0-43b2-a586-b9bb5101ed94\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" Sep 29 11:10:29 crc kubenswrapper[4752]: I0929 11:10:29.438863 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e50680-30b0-43b2-a586-b9bb5101ed94-config-data\") pod \"watcher-kuttl-db-sync-qh5gr\" (UID: \"10e50680-30b0-43b2-a586-b9bb5101ed94\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" Sep 29 11:10:29 crc kubenswrapper[4752]: I0929 11:10:29.440531 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzt2k\" (UniqueName: \"kubernetes.io/projected/10e50680-30b0-43b2-a586-b9bb5101ed94-kube-api-access-xzt2k\") pod \"watcher-kuttl-db-sync-qh5gr\" (UID: \"10e50680-30b0-43b2-a586-b9bb5101ed94\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" Sep 29 11:10:29 crc kubenswrapper[4752]: I0929 11:10:29.542405 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" Sep 29 11:10:30 crc kubenswrapper[4752]: I0929 11:10:30.027272 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr"] Sep 29 11:10:30 crc kubenswrapper[4752]: I0929 11:10:30.661483 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" event={"ID":"10e50680-30b0-43b2-a586-b9bb5101ed94","Type":"ContainerStarted","Data":"c99e3da615c2d0a795c0b7893146592e1a9e083d04457218db84f9dedba7aaf8"} Sep 29 11:10:30 crc kubenswrapper[4752]: I0929 11:10:30.661758 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" event={"ID":"10e50680-30b0-43b2-a586-b9bb5101ed94","Type":"ContainerStarted","Data":"0f9d9e601e9dcb880d4250ad21f102f15f0227b56eed80e3124121820e5f126e"} Sep 29 11:10:30 crc kubenswrapper[4752]: I0929 11:10:30.689490 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" podStartSLOduration=1.689472361 podStartE2EDuration="1.689472361s" podCreationTimestamp="2025-09-29 11:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:10:30.68293763 +0000 UTC m=+1571.472079307" watchObservedRunningTime="2025-09-29 11:10:30.689472361 +0000 UTC m=+1571.478614018" Sep 29 11:10:33 crc kubenswrapper[4752]: I0929 11:10:33.686400 4752 generic.go:334] "Generic (PLEG): container finished" podID="10e50680-30b0-43b2-a586-b9bb5101ed94" containerID="c99e3da615c2d0a795c0b7893146592e1a9e083d04457218db84f9dedba7aaf8" exitCode=0 Sep 29 11:10:33 crc kubenswrapper[4752]: I0929 11:10:33.686506 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" 
event={"ID":"10e50680-30b0-43b2-a586-b9bb5101ed94","Type":"ContainerDied","Data":"c99e3da615c2d0a795c0b7893146592e1a9e083d04457218db84f9dedba7aaf8"} Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.014339 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.108842 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzt2k\" (UniqueName: \"kubernetes.io/projected/10e50680-30b0-43b2-a586-b9bb5101ed94-kube-api-access-xzt2k\") pod \"10e50680-30b0-43b2-a586-b9bb5101ed94\" (UID: \"10e50680-30b0-43b2-a586-b9bb5101ed94\") " Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.108968 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e50680-30b0-43b2-a586-b9bb5101ed94-config-data\") pod \"10e50680-30b0-43b2-a586-b9bb5101ed94\" (UID: \"10e50680-30b0-43b2-a586-b9bb5101ed94\") " Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.109020 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/10e50680-30b0-43b2-a586-b9bb5101ed94-db-sync-config-data\") pod \"10e50680-30b0-43b2-a586-b9bb5101ed94\" (UID: \"10e50680-30b0-43b2-a586-b9bb5101ed94\") " Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.109059 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e50680-30b0-43b2-a586-b9bb5101ed94-combined-ca-bundle\") pod \"10e50680-30b0-43b2-a586-b9bb5101ed94\" (UID: \"10e50680-30b0-43b2-a586-b9bb5101ed94\") " Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.116278 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/10e50680-30b0-43b2-a586-b9bb5101ed94-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "10e50680-30b0-43b2-a586-b9bb5101ed94" (UID: "10e50680-30b0-43b2-a586-b9bb5101ed94"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.116974 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10e50680-30b0-43b2-a586-b9bb5101ed94-kube-api-access-xzt2k" (OuterVolumeSpecName: "kube-api-access-xzt2k") pod "10e50680-30b0-43b2-a586-b9bb5101ed94" (UID: "10e50680-30b0-43b2-a586-b9bb5101ed94"). InnerVolumeSpecName "kube-api-access-xzt2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.131742 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10e50680-30b0-43b2-a586-b9bb5101ed94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10e50680-30b0-43b2-a586-b9bb5101ed94" (UID: "10e50680-30b0-43b2-a586-b9bb5101ed94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.152231 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10e50680-30b0-43b2-a586-b9bb5101ed94-config-data" (OuterVolumeSpecName: "config-data") pod "10e50680-30b0-43b2-a586-b9bb5101ed94" (UID: "10e50680-30b0-43b2-a586-b9bb5101ed94"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.211424 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e50680-30b0-43b2-a586-b9bb5101ed94-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.211467 4752 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/10e50680-30b0-43b2-a586-b9bb5101ed94-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.211480 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e50680-30b0-43b2-a586-b9bb5101ed94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.211490 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzt2k\" (UniqueName: \"kubernetes.io/projected/10e50680-30b0-43b2-a586-b9bb5101ed94-kube-api-access-xzt2k\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.701789 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" event={"ID":"10e50680-30b0-43b2-a586-b9bb5101ed94","Type":"ContainerDied","Data":"0f9d9e601e9dcb880d4250ad21f102f15f0227b56eed80e3124121820e5f126e"} Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.702091 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f9d9e601e9dcb880d4250ad21f102f15f0227b56eed80e3124121820e5f126e" Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.701912 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr" Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.976237 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:10:35 crc kubenswrapper[4752]: E0929 11:10:35.976932 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e50680-30b0-43b2-a586-b9bb5101ed94" containerName="watcher-kuttl-db-sync" Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.976953 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e50680-30b0-43b2-a586-b9bb5101ed94" containerName="watcher-kuttl-db-sync" Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.977160 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="10e50680-30b0-43b2-a586-b9bb5101ed94" containerName="watcher-kuttl-db-sync" Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.978155 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.980884 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-public-svc" Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.981152 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-cv69s" Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.981436 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Sep 29 11:10:35 crc kubenswrapper[4752]: I0929 11:10:35.981592 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-internal-svc" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.000697 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:10:36 crc kubenswrapper[4752]: 
I0929 11:10:36.025571 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.025616 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.025650 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.025665 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b0a1d48-69e7-486a-8163-1961ba8d3501-logs\") pod \"watcher-kuttl-api-0\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.025716 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 
11:10:36.025850 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhpv7\" (UniqueName: \"kubernetes.io/projected/1b0a1d48-69e7-486a-8163-1961ba8d3501-kube-api-access-vhpv7\") pod \"watcher-kuttl-api-0\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.026105 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.047282 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.048431 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.065379 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.086933 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.109896 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.110982 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.113378 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.116654 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.130940 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.131022 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqsz8\" (UniqueName: \"kubernetes.io/projected/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-kube-api-access-hqsz8\") pod \"watcher-kuttl-applier-0\" (UID: \"8b00fc0a-2c62-480e-93e6-95ed8b1305c8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.131088 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"8b00fc0a-2c62-480e-93e6-95ed8b1305c8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.131108 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: 
\"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.131127 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b0a1d48-69e7-486a-8163-1961ba8d3501-logs\") pod \"watcher-kuttl-api-0\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.131867 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.131916 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhpv7\" (UniqueName: \"kubernetes.io/projected/1b0a1d48-69e7-486a-8163-1961ba8d3501-kube-api-access-vhpv7\") pod \"watcher-kuttl-api-0\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.131946 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"8b00fc0a-2c62-480e-93e6-95ed8b1305c8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.132063 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"8b00fc0a-2c62-480e-93e6-95ed8b1305c8\") " 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.132088 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.135364 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.136929 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b0a1d48-69e7-486a-8163-1961ba8d3501-logs\") pod \"watcher-kuttl-api-0\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.137442 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.138385 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.143252 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.144262 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.146122 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.159405 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhpv7\" (UniqueName: \"kubernetes.io/projected/1b0a1d48-69e7-486a-8163-1961ba8d3501-kube-api-access-vhpv7\") pod \"watcher-kuttl-api-0\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.238917 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f39fa61e-8858-4d6e-80db-bc94fdccaec8-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.239049 4752 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-hqsz8\" (UniqueName: \"kubernetes.io/projected/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-kube-api-access-hqsz8\") pod \"watcher-kuttl-applier-0\" (UID: \"8b00fc0a-2c62-480e-93e6-95ed8b1305c8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.239095 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"8b00fc0a-2c62-480e-93e6-95ed8b1305c8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.239124 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f39fa61e-8858-4d6e-80db-bc94fdccaec8-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.239167 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f39fa61e-8858-4d6e-80db-bc94fdccaec8-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.239204 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"8b00fc0a-2c62-480e-93e6-95ed8b1305c8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.239289 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f39fa61e-8858-4d6e-80db-bc94fdccaec8-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.239318 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66rbr\" (UniqueName: \"kubernetes.io/projected/f39fa61e-8858-4d6e-80db-bc94fdccaec8-kube-api-access-66rbr\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.239354 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"8b00fc0a-2c62-480e-93e6-95ed8b1305c8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.239714 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"8b00fc0a-2c62-480e-93e6-95ed8b1305c8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.242003 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"8b00fc0a-2c62-480e-93e6-95ed8b1305c8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.242263 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"8b00fc0a-2c62-480e-93e6-95ed8b1305c8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.256244 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqsz8\" (UniqueName: \"kubernetes.io/projected/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-kube-api-access-hqsz8\") pod \"watcher-kuttl-applier-0\" (UID: \"8b00fc0a-2c62-480e-93e6-95ed8b1305c8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.301509 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.341129 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f39fa61e-8858-4d6e-80db-bc94fdccaec8-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.341247 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f39fa61e-8858-4d6e-80db-bc94fdccaec8-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.341286 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f39fa61e-8858-4d6e-80db-bc94fdccaec8-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.341337 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f39fa61e-8858-4d6e-80db-bc94fdccaec8-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.341363 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66rbr\" (UniqueName: \"kubernetes.io/projected/f39fa61e-8858-4d6e-80db-bc94fdccaec8-kube-api-access-66rbr\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.342011 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f39fa61e-8858-4d6e-80db-bc94fdccaec8-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.348426 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f39fa61e-8858-4d6e-80db-bc94fdccaec8-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.348966 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f39fa61e-8858-4d6e-80db-bc94fdccaec8-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: 
\"f39fa61e-8858-4d6e-80db-bc94fdccaec8\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.350641 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f39fa61e-8858-4d6e-80db-bc94fdccaec8-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.365233 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66rbr\" (UniqueName: \"kubernetes.io/projected/f39fa61e-8858-4d6e-80db-bc94fdccaec8-kube-api-access-66rbr\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.373953 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.431356 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.808561 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:10:36 crc kubenswrapper[4752]: W0929 11:10:36.811000 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b0a1d48_69e7_486a_8163_1961ba8d3501.slice/crio-dde34122c9c68578193254f5e6af0182cf69d39bdce0e51a928e7c81c66a009c WatchSource:0}: Error finding container dde34122c9c68578193254f5e6af0182cf69d39bdce0e51a928e7c81c66a009c: Status 404 returned error can't find the container with id dde34122c9c68578193254f5e6af0182cf69d39bdce0e51a928e7c81c66a009c Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.912745 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:10:36 crc kubenswrapper[4752]: W0929 11:10:36.926180 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf39fa61e_8858_4d6e_80db_bc94fdccaec8.slice/crio-936ccb41c8ff30271a8e57f830fc35a21510627a65779dcabce327b0d122c653 WatchSource:0}: Error finding container 936ccb41c8ff30271a8e57f830fc35a21510627a65779dcabce327b0d122c653: Status 404 returned error can't find the container with id 936ccb41c8ff30271a8e57f830fc35a21510627a65779dcabce327b0d122c653 Sep 29 11:10:36 crc kubenswrapper[4752]: I0929 11:10:36.927549 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:10:37 crc kubenswrapper[4752]: I0929 11:10:37.719642 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" 
event={"ID":"8b00fc0a-2c62-480e-93e6-95ed8b1305c8","Type":"ContainerStarted","Data":"08fb7c9ca8887afc3730321ac0ce1b0e0e37d821b9e72f8a10de80286cba82ea"} Sep 29 11:10:37 crc kubenswrapper[4752]: I0929 11:10:37.719996 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"8b00fc0a-2c62-480e-93e6-95ed8b1305c8","Type":"ContainerStarted","Data":"75107b0321cd3aaf5d2910da3a15988d281de943027b68eaf2e91171787ddd8c"} Sep 29 11:10:37 crc kubenswrapper[4752]: I0929 11:10:37.723318 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1b0a1d48-69e7-486a-8163-1961ba8d3501","Type":"ContainerStarted","Data":"54ac9ff5c2216d873da434228eb3eef8de401bc1ec5dbc7c9f11f451740d804e"} Sep 29 11:10:37 crc kubenswrapper[4752]: I0929 11:10:37.723381 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1b0a1d48-69e7-486a-8163-1961ba8d3501","Type":"ContainerStarted","Data":"78468825fe549e376ecd19a9147b25b3c7226a1fad3845684a601b80708898ee"} Sep 29 11:10:37 crc kubenswrapper[4752]: I0929 11:10:37.723394 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1b0a1d48-69e7-486a-8163-1961ba8d3501","Type":"ContainerStarted","Data":"dde34122c9c68578193254f5e6af0182cf69d39bdce0e51a928e7c81c66a009c"} Sep 29 11:10:37 crc kubenswrapper[4752]: I0929 11:10:37.723606 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:37 crc kubenswrapper[4752]: I0929 11:10:37.725669 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f39fa61e-8858-4d6e-80db-bc94fdccaec8","Type":"ContainerStarted","Data":"a17245ec1c9efb32f26a19fe377a440a7fd3b84c08fbbdd6d68079700e9a78e0"} Sep 29 11:10:37 crc kubenswrapper[4752]: I0929 
11:10:37.725695 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f39fa61e-8858-4d6e-80db-bc94fdccaec8","Type":"ContainerStarted","Data":"936ccb41c8ff30271a8e57f830fc35a21510627a65779dcabce327b0d122c653"} Sep 29 11:10:37 crc kubenswrapper[4752]: I0929 11:10:37.742662 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=1.742641627 podStartE2EDuration="1.742641627s" podCreationTimestamp="2025-09-29 11:10:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:10:37.739195878 +0000 UTC m=+1578.528337545" watchObservedRunningTime="2025-09-29 11:10:37.742641627 +0000 UTC m=+1578.531783294" Sep 29 11:10:37 crc kubenswrapper[4752]: I0929 11:10:37.757823 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.757783654 podStartE2EDuration="2.757783654s" podCreationTimestamp="2025-09-29 11:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:10:37.757497277 +0000 UTC m=+1578.546638974" watchObservedRunningTime="2025-09-29 11:10:37.757783654 +0000 UTC m=+1578.546925341" Sep 29 11:10:37 crc kubenswrapper[4752]: I0929 11:10:37.778132 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=1.778110888 podStartE2EDuration="1.778110888s" podCreationTimestamp="2025-09-29 11:10:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:10:37.771533555 +0000 UTC m=+1578.560675232" watchObservedRunningTime="2025-09-29 11:10:37.778110888 +0000 UTC 
m=+1578.567252555" Sep 29 11:10:40 crc kubenswrapper[4752]: I0929 11:10:40.160287 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:41 crc kubenswrapper[4752]: I0929 11:10:41.302425 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:41 crc kubenswrapper[4752]: I0929 11:10:41.375178 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:10:42 crc kubenswrapper[4752]: I0929 11:10:42.030982 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:10:42 crc kubenswrapper[4752]: E0929 11:10:42.031198 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:10:45 crc kubenswrapper[4752]: I0929 11:10:45.945229 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:46 crc kubenswrapper[4752]: I0929 11:10:46.301956 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:46 crc kubenswrapper[4752]: I0929 11:10:46.311679 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:46 crc kubenswrapper[4752]: I0929 11:10:46.374593 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:10:46 
crc kubenswrapper[4752]: I0929 11:10:46.399779 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:10:46 crc kubenswrapper[4752]: I0929 11:10:46.431920 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:46 crc kubenswrapper[4752]: I0929 11:10:46.458670 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:46 crc kubenswrapper[4752]: I0929 11:10:46.800181 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:46 crc kubenswrapper[4752]: I0929 11:10:46.808269 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:10:46 crc kubenswrapper[4752]: I0929 11:10:46.828688 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:10:46 crc kubenswrapper[4752]: I0929 11:10:46.832068 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:10:49 crc kubenswrapper[4752]: I0929 11:10:49.060634 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:10:49 crc kubenswrapper[4752]: I0929 11:10:49.061258 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="58504de4-1881-41d4-8b99-630c0e8cce8a" containerName="ceilometer-central-agent" containerID="cri-o://74c9db24212877ed8a19e02590367c51eb30537fcb48d1bd3a4f84b3ee364e0a" gracePeriod=30 Sep 29 11:10:49 crc kubenswrapper[4752]: I0929 11:10:49.061408 4752 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="watcher-kuttl-default/ceilometer-0" podUID="58504de4-1881-41d4-8b99-630c0e8cce8a" containerName="ceilometer-notification-agent" containerID="cri-o://4393caf974990eb735039281298a506791486ea24bd9966039f8cbeb151a8e7e" gracePeriod=30 Sep 29 11:10:49 crc kubenswrapper[4752]: I0929 11:10:49.061403 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="58504de4-1881-41d4-8b99-630c0e8cce8a" containerName="sg-core" containerID="cri-o://5ff5485ae83a07ddf8e9978b23ae2f4fcca8c865952210451b9eb567a565b012" gracePeriod=30 Sep 29 11:10:49 crc kubenswrapper[4752]: I0929 11:10:49.061332 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="58504de4-1881-41d4-8b99-630c0e8cce8a" containerName="proxy-httpd" containerID="cri-o://6de5ab66ff6da4b6fd574ef251dbad6c1d30077f6f0b67c40d464cc56af5cede" gracePeriod=30 Sep 29 11:10:49 crc kubenswrapper[4752]: I0929 11:10:49.824441 4752 generic.go:334] "Generic (PLEG): container finished" podID="58504de4-1881-41d4-8b99-630c0e8cce8a" containerID="6de5ab66ff6da4b6fd574ef251dbad6c1d30077f6f0b67c40d464cc56af5cede" exitCode=0 Sep 29 11:10:49 crc kubenswrapper[4752]: I0929 11:10:49.824814 4752 generic.go:334] "Generic (PLEG): container finished" podID="58504de4-1881-41d4-8b99-630c0e8cce8a" containerID="5ff5485ae83a07ddf8e9978b23ae2f4fcca8c865952210451b9eb567a565b012" exitCode=2 Sep 29 11:10:49 crc kubenswrapper[4752]: I0929 11:10:49.824824 4752 generic.go:334] "Generic (PLEG): container finished" podID="58504de4-1881-41d4-8b99-630c0e8cce8a" containerID="74c9db24212877ed8a19e02590367c51eb30537fcb48d1bd3a4f84b3ee364e0a" exitCode=0 Sep 29 11:10:49 crc kubenswrapper[4752]: I0929 11:10:49.824523 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"58504de4-1881-41d4-8b99-630c0e8cce8a","Type":"ContainerDied","Data":"6de5ab66ff6da4b6fd574ef251dbad6c1d30077f6f0b67c40d464cc56af5cede"} Sep 29 11:10:49 crc kubenswrapper[4752]: I0929 11:10:49.824862 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"58504de4-1881-41d4-8b99-630c0e8cce8a","Type":"ContainerDied","Data":"5ff5485ae83a07ddf8e9978b23ae2f4fcca8c865952210451b9eb567a565b012"} Sep 29 11:10:49 crc kubenswrapper[4752]: I0929 11:10:49.824900 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"58504de4-1881-41d4-8b99-630c0e8cce8a","Type":"ContainerDied","Data":"74c9db24212877ed8a19e02590367c51eb30537fcb48d1bd3a4f84b3ee364e0a"} Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.675622 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.750633 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-combined-ca-bundle\") pod \"58504de4-1881-41d4-8b99-630c0e8cce8a\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.750755 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-ceilometer-tls-certs\") pod \"58504de4-1881-41d4-8b99-630c0e8cce8a\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.751162 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-sg-core-conf-yaml\") pod \"58504de4-1881-41d4-8b99-630c0e8cce8a\" (UID: 
\"58504de4-1881-41d4-8b99-630c0e8cce8a\") " Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.751193 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58504de4-1881-41d4-8b99-630c0e8cce8a-run-httpd\") pod \"58504de4-1881-41d4-8b99-630c0e8cce8a\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.751237 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-config-data\") pod \"58504de4-1881-41d4-8b99-630c0e8cce8a\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.751264 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58504de4-1881-41d4-8b99-630c0e8cce8a-log-httpd\") pod \"58504de4-1881-41d4-8b99-630c0e8cce8a\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.751306 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l52r2\" (UniqueName: \"kubernetes.io/projected/58504de4-1881-41d4-8b99-630c0e8cce8a-kube-api-access-l52r2\") pod \"58504de4-1881-41d4-8b99-630c0e8cce8a\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.751328 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-scripts\") pod \"58504de4-1881-41d4-8b99-630c0e8cce8a\" (UID: \"58504de4-1881-41d4-8b99-630c0e8cce8a\") " Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.754187 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58504de4-1881-41d4-8b99-630c0e8cce8a-run-httpd" 
(OuterVolumeSpecName: "run-httpd") pod "58504de4-1881-41d4-8b99-630c0e8cce8a" (UID: "58504de4-1881-41d4-8b99-630c0e8cce8a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.754239 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58504de4-1881-41d4-8b99-630c0e8cce8a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "58504de4-1881-41d4-8b99-630c0e8cce8a" (UID: "58504de4-1881-41d4-8b99-630c0e8cce8a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.760521 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58504de4-1881-41d4-8b99-630c0e8cce8a-kube-api-access-l52r2" (OuterVolumeSpecName: "kube-api-access-l52r2") pod "58504de4-1881-41d4-8b99-630c0e8cce8a" (UID: "58504de4-1881-41d4-8b99-630c0e8cce8a"). InnerVolumeSpecName "kube-api-access-l52r2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.765310 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-scripts" (OuterVolumeSpecName: "scripts") pod "58504de4-1881-41d4-8b99-630c0e8cce8a" (UID: "58504de4-1881-41d4-8b99-630c0e8cce8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.784343 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "58504de4-1881-41d4-8b99-630c0e8cce8a" (UID: "58504de4-1881-41d4-8b99-630c0e8cce8a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.803906 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "58504de4-1881-41d4-8b99-630c0e8cce8a" (UID: "58504de4-1881-41d4-8b99-630c0e8cce8a"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.829923 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58504de4-1881-41d4-8b99-630c0e8cce8a" (UID: "58504de4-1881-41d4-8b99-630c0e8cce8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.853599 4752 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.853643 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.853653 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58504de4-1881-41d4-8b99-630c0e8cce8a-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.853662 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58504de4-1881-41d4-8b99-630c0e8cce8a-log-httpd\") on node 
\"crc\" DevicePath \"\"" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.853671 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l52r2\" (UniqueName: \"kubernetes.io/projected/58504de4-1881-41d4-8b99-630c0e8cce8a-kube-api-access-l52r2\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.853681 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.853689 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.864990 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-config-data" (OuterVolumeSpecName: "config-data") pod "58504de4-1881-41d4-8b99-630c0e8cce8a" (UID: "58504de4-1881-41d4-8b99-630c0e8cce8a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.865092 4752 generic.go:334] "Generic (PLEG): container finished" podID="58504de4-1881-41d4-8b99-630c0e8cce8a" containerID="4393caf974990eb735039281298a506791486ea24bd9966039f8cbeb151a8e7e" exitCode=0 Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.865128 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"58504de4-1881-41d4-8b99-630c0e8cce8a","Type":"ContainerDied","Data":"4393caf974990eb735039281298a506791486ea24bd9966039f8cbeb151a8e7e"} Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.865152 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"58504de4-1881-41d4-8b99-630c0e8cce8a","Type":"ContainerDied","Data":"24f29ff5061d8ec9ac918087887645ae124c2b259a6cade44ba550fefb8b41a0"} Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.865170 4752 scope.go:117] "RemoveContainer" containerID="6de5ab66ff6da4b6fd574ef251dbad6c1d30077f6f0b67c40d464cc56af5cede" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.865167 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.889251 4752 scope.go:117] "RemoveContainer" containerID="5ff5485ae83a07ddf8e9978b23ae2f4fcca8c865952210451b9eb567a565b012" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.911968 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.913069 4752 scope.go:117] "RemoveContainer" containerID="4393caf974990eb735039281298a506791486ea24bd9966039f8cbeb151a8e7e" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.918759 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.934961 4752 scope.go:117] "RemoveContainer" containerID="74c9db24212877ed8a19e02590367c51eb30537fcb48d1bd3a4f84b3ee364e0a" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.940594 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:10:54 crc kubenswrapper[4752]: E0929 11:10:54.941196 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58504de4-1881-41d4-8b99-630c0e8cce8a" containerName="sg-core" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.941283 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="58504de4-1881-41d4-8b99-630c0e8cce8a" containerName="sg-core" Sep 29 11:10:54 crc kubenswrapper[4752]: E0929 11:10:54.941360 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58504de4-1881-41d4-8b99-630c0e8cce8a" containerName="ceilometer-notification-agent" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.941431 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="58504de4-1881-41d4-8b99-630c0e8cce8a" containerName="ceilometer-notification-agent" Sep 29 11:10:54 crc kubenswrapper[4752]: E0929 11:10:54.941546 4752 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="58504de4-1881-41d4-8b99-630c0e8cce8a" containerName="proxy-httpd" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.941621 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="58504de4-1881-41d4-8b99-630c0e8cce8a" containerName="proxy-httpd" Sep 29 11:10:54 crc kubenswrapper[4752]: E0929 11:10:54.941725 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58504de4-1881-41d4-8b99-630c0e8cce8a" containerName="ceilometer-central-agent" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.941819 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="58504de4-1881-41d4-8b99-630c0e8cce8a" containerName="ceilometer-central-agent" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.942069 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="58504de4-1881-41d4-8b99-630c0e8cce8a" containerName="ceilometer-notification-agent" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.942164 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="58504de4-1881-41d4-8b99-630c0e8cce8a" containerName="ceilometer-central-agent" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.942256 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="58504de4-1881-41d4-8b99-630c0e8cce8a" containerName="sg-core" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.942337 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="58504de4-1881-41d4-8b99-630c0e8cce8a" containerName="proxy-httpd" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.945568 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.951475 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.951592 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.952147 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.954870 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58504de4-1881-41d4-8b99-630c0e8cce8a-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.956583 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.977220 4752 scope.go:117] "RemoveContainer" containerID="6de5ab66ff6da4b6fd574ef251dbad6c1d30077f6f0b67c40d464cc56af5cede" Sep 29 11:10:54 crc kubenswrapper[4752]: E0929 11:10:54.978364 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6de5ab66ff6da4b6fd574ef251dbad6c1d30077f6f0b67c40d464cc56af5cede\": container with ID starting with 6de5ab66ff6da4b6fd574ef251dbad6c1d30077f6f0b67c40d464cc56af5cede not found: ID does not exist" containerID="6de5ab66ff6da4b6fd574ef251dbad6c1d30077f6f0b67c40d464cc56af5cede" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.978419 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de5ab66ff6da4b6fd574ef251dbad6c1d30077f6f0b67c40d464cc56af5cede"} err="failed to get container status 
\"6de5ab66ff6da4b6fd574ef251dbad6c1d30077f6f0b67c40d464cc56af5cede\": rpc error: code = NotFound desc = could not find container \"6de5ab66ff6da4b6fd574ef251dbad6c1d30077f6f0b67c40d464cc56af5cede\": container with ID starting with 6de5ab66ff6da4b6fd574ef251dbad6c1d30077f6f0b67c40d464cc56af5cede not found: ID does not exist" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.978450 4752 scope.go:117] "RemoveContainer" containerID="5ff5485ae83a07ddf8e9978b23ae2f4fcca8c865952210451b9eb567a565b012" Sep 29 11:10:54 crc kubenswrapper[4752]: E0929 11:10:54.980410 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ff5485ae83a07ddf8e9978b23ae2f4fcca8c865952210451b9eb567a565b012\": container with ID starting with 5ff5485ae83a07ddf8e9978b23ae2f4fcca8c865952210451b9eb567a565b012 not found: ID does not exist" containerID="5ff5485ae83a07ddf8e9978b23ae2f4fcca8c865952210451b9eb567a565b012" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.980457 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff5485ae83a07ddf8e9978b23ae2f4fcca8c865952210451b9eb567a565b012"} err="failed to get container status \"5ff5485ae83a07ddf8e9978b23ae2f4fcca8c865952210451b9eb567a565b012\": rpc error: code = NotFound desc = could not find container \"5ff5485ae83a07ddf8e9978b23ae2f4fcca8c865952210451b9eb567a565b012\": container with ID starting with 5ff5485ae83a07ddf8e9978b23ae2f4fcca8c865952210451b9eb567a565b012 not found: ID does not exist" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.980491 4752 scope.go:117] "RemoveContainer" containerID="4393caf974990eb735039281298a506791486ea24bd9966039f8cbeb151a8e7e" Sep 29 11:10:54 crc kubenswrapper[4752]: E0929 11:10:54.981680 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4393caf974990eb735039281298a506791486ea24bd9966039f8cbeb151a8e7e\": container with ID starting with 4393caf974990eb735039281298a506791486ea24bd9966039f8cbeb151a8e7e not found: ID does not exist" containerID="4393caf974990eb735039281298a506791486ea24bd9966039f8cbeb151a8e7e" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.981722 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4393caf974990eb735039281298a506791486ea24bd9966039f8cbeb151a8e7e"} err="failed to get container status \"4393caf974990eb735039281298a506791486ea24bd9966039f8cbeb151a8e7e\": rpc error: code = NotFound desc = could not find container \"4393caf974990eb735039281298a506791486ea24bd9966039f8cbeb151a8e7e\": container with ID starting with 4393caf974990eb735039281298a506791486ea24bd9966039f8cbeb151a8e7e not found: ID does not exist" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.981747 4752 scope.go:117] "RemoveContainer" containerID="74c9db24212877ed8a19e02590367c51eb30537fcb48d1bd3a4f84b3ee364e0a" Sep 29 11:10:54 crc kubenswrapper[4752]: E0929 11:10:54.982220 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74c9db24212877ed8a19e02590367c51eb30537fcb48d1bd3a4f84b3ee364e0a\": container with ID starting with 74c9db24212877ed8a19e02590367c51eb30537fcb48d1bd3a4f84b3ee364e0a not found: ID does not exist" containerID="74c9db24212877ed8a19e02590367c51eb30537fcb48d1bd3a4f84b3ee364e0a" Sep 29 11:10:54 crc kubenswrapper[4752]: I0929 11:10:54.982252 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c9db24212877ed8a19e02590367c51eb30537fcb48d1bd3a4f84b3ee364e0a"} err="failed to get container status \"74c9db24212877ed8a19e02590367c51eb30537fcb48d1bd3a4f84b3ee364e0a\": rpc error: code = NotFound desc = could not find container \"74c9db24212877ed8a19e02590367c51eb30537fcb48d1bd3a4f84b3ee364e0a\": container with ID 
starting with 74c9db24212877ed8a19e02590367c51eb30537fcb48d1bd3a4f84b3ee364e0a not found: ID does not exist" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.056020 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-config-data\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.056122 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgjjr\" (UniqueName: \"kubernetes.io/projected/8b09d00c-78f2-4399-bc78-3c901f3470ad-kube-api-access-cgjjr\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.056154 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.056183 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.056284 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.056322 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-scripts\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.056383 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b09d00c-78f2-4399-bc78-3c901f3470ad-log-httpd\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.056458 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b09d00c-78f2-4399-bc78-3c901f3470ad-run-httpd\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.157826 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-config-data\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.157899 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgjjr\" (UniqueName: \"kubernetes.io/projected/8b09d00c-78f2-4399-bc78-3c901f3470ad-kube-api-access-cgjjr\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: 
I0929 11:10:55.157946 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.157981 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.158080 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.158103 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-scripts\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.158150 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b09d00c-78f2-4399-bc78-3c901f3470ad-log-httpd\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.158227 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8b09d00c-78f2-4399-bc78-3c901f3470ad-run-httpd\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.158770 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b09d00c-78f2-4399-bc78-3c901f3470ad-log-httpd\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.158791 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b09d00c-78f2-4399-bc78-3c901f3470ad-run-httpd\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.162989 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-config-data\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.163650 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.163875 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc 
kubenswrapper[4752]: I0929 11:10:55.164165 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-scripts\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.164722 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.187761 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgjjr\" (UniqueName: \"kubernetes.io/projected/8b09d00c-78f2-4399-bc78-3c901f3470ad-kube-api-access-cgjjr\") pod \"ceilometer-0\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.276473 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.861231 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:10:55 crc kubenswrapper[4752]: W0929 11:10:55.865998 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b09d00c_78f2_4399_bc78_3c901f3470ad.slice/crio-0dcf8829037bd340f4cd12ccabd43f42549de12b5bbbe1b0821e3f6f25e724db WatchSource:0}: Error finding container 0dcf8829037bd340f4cd12ccabd43f42549de12b5bbbe1b0821e3f6f25e724db: Status 404 returned error can't find the container with id 0dcf8829037bd340f4cd12ccabd43f42549de12b5bbbe1b0821e3f6f25e724db Sep 29 11:10:55 crc kubenswrapper[4752]: I0929 11:10:55.869895 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 11:10:56 crc kubenswrapper[4752]: I0929 11:10:56.054694 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58504de4-1881-41d4-8b99-630c0e8cce8a" path="/var/lib/kubelet/pods/58504de4-1881-41d4-8b99-630c0e8cce8a/volumes" Sep 29 11:10:56 crc kubenswrapper[4752]: I0929 11:10:56.887527 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8b09d00c-78f2-4399-bc78-3c901f3470ad","Type":"ContainerStarted","Data":"0dcf8829037bd340f4cd12ccabd43f42549de12b5bbbe1b0821e3f6f25e724db"} Sep 29 11:10:57 crc kubenswrapper[4752]: I0929 11:10:57.033013 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:10:57 crc kubenswrapper[4752]: E0929 11:10:57.033401 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:10:57 crc kubenswrapper[4752]: I0929 11:10:57.898326 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8b09d00c-78f2-4399-bc78-3c901f3470ad","Type":"ContainerStarted","Data":"a59217981fcb9b639e460e9b3c77de8fb9adcca23087040a03f8f57cd23a1b8b"} Sep 29 11:10:57 crc kubenswrapper[4752]: I0929 11:10:57.898726 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8b09d00c-78f2-4399-bc78-3c901f3470ad","Type":"ContainerStarted","Data":"7027072bd227485b6c8846e90b7a47acbf71310ea093d71a484c0d57bd584f85"} Sep 29 11:10:58 crc kubenswrapper[4752]: I0929 11:10:58.909055 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8b09d00c-78f2-4399-bc78-3c901f3470ad","Type":"ContainerStarted","Data":"3d0632330e2d8ac690c35022aa66b02ac23ae110fccd155839efcc35e041325d"} Sep 29 11:11:00 crc kubenswrapper[4752]: I0929 11:11:00.945948 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8b09d00c-78f2-4399-bc78-3c901f3470ad","Type":"ContainerStarted","Data":"f185d6ba4609c787ea95f5509f05e77b699306647b1a4f71d4c17539ca5826c6"} Sep 29 11:11:00 crc kubenswrapper[4752]: I0929 11:11:00.946583 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:11:00 crc kubenswrapper[4752]: I0929 11:11:00.972572 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.485215224 podStartE2EDuration="6.972551353s" podCreationTimestamp="2025-09-29 11:10:54 +0000 UTC" firstStartedPulling="2025-09-29 11:10:55.869626882 +0000 UTC 
m=+1596.658768549" lastFinishedPulling="2025-09-29 11:11:00.356963011 +0000 UTC m=+1601.146104678" observedRunningTime="2025-09-29 11:11:00.972154922 +0000 UTC m=+1601.761296589" watchObservedRunningTime="2025-09-29 11:11:00.972551353 +0000 UTC m=+1601.761693020" Sep 29 11:11:09 crc kubenswrapper[4752]: I0929 11:11:09.031895 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:11:09 crc kubenswrapper[4752]: E0929 11:11:09.032825 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:11:20 crc kubenswrapper[4752]: I0929 11:11:20.900633 4752 scope.go:117] "RemoveContainer" containerID="97b115dcd0fec96d222056a04cfbc24da589dee1b137fb838ca8229cb9a12518" Sep 29 11:11:20 crc kubenswrapper[4752]: I0929 11:11:20.927132 4752 scope.go:117] "RemoveContainer" containerID="1ebb764c140be5cb6d96d917c5c99c0adcfd7a731aea30b903e1c7551a5a8f0a" Sep 29 11:11:22 crc kubenswrapper[4752]: I0929 11:11:22.031680 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:11:22 crc kubenswrapper[4752]: E0929 11:11:22.032335 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:11:25 crc 
kubenswrapper[4752]: I0929 11:11:25.286418 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:11:36 crc kubenswrapper[4752]: I0929 11:11:36.030971 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:11:36 crc kubenswrapper[4752]: E0929 11:11:36.031736 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:11:47 crc kubenswrapper[4752]: I0929 11:11:47.031462 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:11:47 crc kubenswrapper[4752]: E0929 11:11:47.032682 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:12:02 crc kubenswrapper[4752]: I0929 11:12:02.031283 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:12:02 crc kubenswrapper[4752]: E0929 11:12:02.032031 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:12:14 crc kubenswrapper[4752]: I0929 11:12:14.031543 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:12:14 crc kubenswrapper[4752]: E0929 11:12:14.032291 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:12:21 crc kubenswrapper[4752]: I0929 11:12:21.020183 4752 scope.go:117] "RemoveContainer" containerID="687ff988f13298c88fb93ee6de176587291f824339939fe4cfec4216479ceab7" Sep 29 11:12:23 crc kubenswrapper[4752]: I0929 11:12:23.373141 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lcn2q"] Sep 29 11:12:23 crc kubenswrapper[4752]: I0929 11:12:23.375536 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lcn2q" Sep 29 11:12:23 crc kubenswrapper[4752]: I0929 11:12:23.383103 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lcn2q"] Sep 29 11:12:23 crc kubenswrapper[4752]: I0929 11:12:23.444929 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168dc9fa-2c32-4bf0-915d-79dd62d497f4-catalog-content\") pod \"redhat-operators-lcn2q\" (UID: \"168dc9fa-2c32-4bf0-915d-79dd62d497f4\") " pod="openshift-marketplace/redhat-operators-lcn2q" Sep 29 11:12:23 crc kubenswrapper[4752]: I0929 11:12:23.445007 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n855v\" (UniqueName: \"kubernetes.io/projected/168dc9fa-2c32-4bf0-915d-79dd62d497f4-kube-api-access-n855v\") pod \"redhat-operators-lcn2q\" (UID: \"168dc9fa-2c32-4bf0-915d-79dd62d497f4\") " pod="openshift-marketplace/redhat-operators-lcn2q" Sep 29 11:12:23 crc kubenswrapper[4752]: I0929 11:12:23.445058 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168dc9fa-2c32-4bf0-915d-79dd62d497f4-utilities\") pod \"redhat-operators-lcn2q\" (UID: \"168dc9fa-2c32-4bf0-915d-79dd62d497f4\") " pod="openshift-marketplace/redhat-operators-lcn2q" Sep 29 11:12:23 crc kubenswrapper[4752]: I0929 11:12:23.546341 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n855v\" (UniqueName: \"kubernetes.io/projected/168dc9fa-2c32-4bf0-915d-79dd62d497f4-kube-api-access-n855v\") pod \"redhat-operators-lcn2q\" (UID: \"168dc9fa-2c32-4bf0-915d-79dd62d497f4\") " pod="openshift-marketplace/redhat-operators-lcn2q" Sep 29 11:12:23 crc kubenswrapper[4752]: I0929 11:12:23.546731 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168dc9fa-2c32-4bf0-915d-79dd62d497f4-utilities\") pod \"redhat-operators-lcn2q\" (UID: \"168dc9fa-2c32-4bf0-915d-79dd62d497f4\") " pod="openshift-marketplace/redhat-operators-lcn2q" Sep 29 11:12:23 crc kubenswrapper[4752]: I0929 11:12:23.546941 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168dc9fa-2c32-4bf0-915d-79dd62d497f4-catalog-content\") pod \"redhat-operators-lcn2q\" (UID: \"168dc9fa-2c32-4bf0-915d-79dd62d497f4\") " pod="openshift-marketplace/redhat-operators-lcn2q" Sep 29 11:12:23 crc kubenswrapper[4752]: I0929 11:12:23.547684 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168dc9fa-2c32-4bf0-915d-79dd62d497f4-catalog-content\") pod \"redhat-operators-lcn2q\" (UID: \"168dc9fa-2c32-4bf0-915d-79dd62d497f4\") " pod="openshift-marketplace/redhat-operators-lcn2q" Sep 29 11:12:23 crc kubenswrapper[4752]: I0929 11:12:23.547846 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168dc9fa-2c32-4bf0-915d-79dd62d497f4-utilities\") pod \"redhat-operators-lcn2q\" (UID: \"168dc9fa-2c32-4bf0-915d-79dd62d497f4\") " pod="openshift-marketplace/redhat-operators-lcn2q" Sep 29 11:12:23 crc kubenswrapper[4752]: I0929 11:12:23.567383 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n855v\" (UniqueName: \"kubernetes.io/projected/168dc9fa-2c32-4bf0-915d-79dd62d497f4-kube-api-access-n855v\") pod \"redhat-operators-lcn2q\" (UID: \"168dc9fa-2c32-4bf0-915d-79dd62d497f4\") " pod="openshift-marketplace/redhat-operators-lcn2q" Sep 29 11:12:23 crc kubenswrapper[4752]: I0929 11:12:23.697850 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lcn2q" Sep 29 11:12:24 crc kubenswrapper[4752]: I0929 11:12:24.182988 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lcn2q"] Sep 29 11:12:24 crc kubenswrapper[4752]: I0929 11:12:24.634022 4752 generic.go:334] "Generic (PLEG): container finished" podID="168dc9fa-2c32-4bf0-915d-79dd62d497f4" containerID="1efe63563b053b4cc5f05d803d8d10a924f2e25999512d1913ea41aa7b845c38" exitCode=0 Sep 29 11:12:24 crc kubenswrapper[4752]: I0929 11:12:24.634119 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcn2q" event={"ID":"168dc9fa-2c32-4bf0-915d-79dd62d497f4","Type":"ContainerDied","Data":"1efe63563b053b4cc5f05d803d8d10a924f2e25999512d1913ea41aa7b845c38"} Sep 29 11:12:24 crc kubenswrapper[4752]: I0929 11:12:24.634241 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcn2q" event={"ID":"168dc9fa-2c32-4bf0-915d-79dd62d497f4","Type":"ContainerStarted","Data":"34c7e5a6393c76cfd94c08075486cd00c2aa7237234b62ff7113619d28804dc2"} Sep 29 11:12:25 crc kubenswrapper[4752]: I0929 11:12:25.645292 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcn2q" event={"ID":"168dc9fa-2c32-4bf0-915d-79dd62d497f4","Type":"ContainerStarted","Data":"5db5a1212417cb4c7eb47db568e6a4778f7011c3e80caacb5c940f39c07d6dd3"} Sep 29 11:12:27 crc kubenswrapper[4752]: I0929 11:12:27.663445 4752 generic.go:334] "Generic (PLEG): container finished" podID="168dc9fa-2c32-4bf0-915d-79dd62d497f4" containerID="5db5a1212417cb4c7eb47db568e6a4778f7011c3e80caacb5c940f39c07d6dd3" exitCode=0 Sep 29 11:12:27 crc kubenswrapper[4752]: I0929 11:12:27.663499 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcn2q" 
event={"ID":"168dc9fa-2c32-4bf0-915d-79dd62d497f4","Type":"ContainerDied","Data":"5db5a1212417cb4c7eb47db568e6a4778f7011c3e80caacb5c940f39c07d6dd3"} Sep 29 11:12:28 crc kubenswrapper[4752]: I0929 11:12:28.675344 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcn2q" event={"ID":"168dc9fa-2c32-4bf0-915d-79dd62d497f4","Type":"ContainerStarted","Data":"e3924a0772f610357f3e56b0d36c3c26551c38e8002fbc1fb936cabe62d5966f"} Sep 29 11:12:28 crc kubenswrapper[4752]: I0929 11:12:28.703775 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lcn2q" podStartSLOduration=2.276935641 podStartE2EDuration="5.703748302s" podCreationTimestamp="2025-09-29 11:12:23 +0000 UTC" firstStartedPulling="2025-09-29 11:12:24.635710217 +0000 UTC m=+1685.424851884" lastFinishedPulling="2025-09-29 11:12:28.062522878 +0000 UTC m=+1688.851664545" observedRunningTime="2025-09-29 11:12:28.696742498 +0000 UTC m=+1689.485884175" watchObservedRunningTime="2025-09-29 11:12:28.703748302 +0000 UTC m=+1689.492889979" Sep 29 11:12:29 crc kubenswrapper[4752]: I0929 11:12:29.769402 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mnhzk"] Sep 29 11:12:29 crc kubenswrapper[4752]: I0929 11:12:29.771249 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mnhzk" Sep 29 11:12:29 crc kubenswrapper[4752]: I0929 11:12:29.783588 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mnhzk"] Sep 29 11:12:29 crc kubenswrapper[4752]: I0929 11:12:29.852290 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff-catalog-content\") pod \"certified-operators-mnhzk\" (UID: \"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff\") " pod="openshift-marketplace/certified-operators-mnhzk" Sep 29 11:12:29 crc kubenswrapper[4752]: I0929 11:12:29.852421 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpmrj\" (UniqueName: \"kubernetes.io/projected/f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff-kube-api-access-vpmrj\") pod \"certified-operators-mnhzk\" (UID: \"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff\") " pod="openshift-marketplace/certified-operators-mnhzk" Sep 29 11:12:29 crc kubenswrapper[4752]: I0929 11:12:29.852464 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff-utilities\") pod \"certified-operators-mnhzk\" (UID: \"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff\") " pod="openshift-marketplace/certified-operators-mnhzk" Sep 29 11:12:29 crc kubenswrapper[4752]: I0929 11:12:29.953732 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpmrj\" (UniqueName: \"kubernetes.io/projected/f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff-kube-api-access-vpmrj\") pod \"certified-operators-mnhzk\" (UID: \"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff\") " pod="openshift-marketplace/certified-operators-mnhzk" Sep 29 11:12:29 crc kubenswrapper[4752]: I0929 11:12:29.953777 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff-utilities\") pod \"certified-operators-mnhzk\" (UID: \"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff\") " pod="openshift-marketplace/certified-operators-mnhzk" Sep 29 11:12:29 crc kubenswrapper[4752]: I0929 11:12:29.953873 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff-catalog-content\") pod \"certified-operators-mnhzk\" (UID: \"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff\") " pod="openshift-marketplace/certified-operators-mnhzk" Sep 29 11:12:29 crc kubenswrapper[4752]: I0929 11:12:29.954379 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff-catalog-content\") pod \"certified-operators-mnhzk\" (UID: \"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff\") " pod="openshift-marketplace/certified-operators-mnhzk" Sep 29 11:12:29 crc kubenswrapper[4752]: I0929 11:12:29.954407 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff-utilities\") pod \"certified-operators-mnhzk\" (UID: \"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff\") " pod="openshift-marketplace/certified-operators-mnhzk" Sep 29 11:12:29 crc kubenswrapper[4752]: I0929 11:12:29.972305 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpmrj\" (UniqueName: \"kubernetes.io/projected/f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff-kube-api-access-vpmrj\") pod \"certified-operators-mnhzk\" (UID: \"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff\") " pod="openshift-marketplace/certified-operators-mnhzk" Sep 29 11:12:30 crc kubenswrapper[4752]: I0929 11:12:30.037205 4752 scope.go:117] "RemoveContainer" 
containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:12:30 crc kubenswrapper[4752]: E0929 11:12:30.037470 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:12:30 crc kubenswrapper[4752]: I0929 11:12:30.095698 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mnhzk" Sep 29 11:12:30 crc kubenswrapper[4752]: I0929 11:12:30.628540 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mnhzk"] Sep 29 11:12:30 crc kubenswrapper[4752]: W0929 11:12:30.668706 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9ee24d0_9dd5_4fa4_8483_ecd0b3f0c4ff.slice/crio-910d11f3287e93a47225b77095b6286676a94890e174acbed685601ad3ee3697 WatchSource:0}: Error finding container 910d11f3287e93a47225b77095b6286676a94890e174acbed685601ad3ee3697: Status 404 returned error can't find the container with id 910d11f3287e93a47225b77095b6286676a94890e174acbed685601ad3ee3697 Sep 29 11:12:30 crc kubenswrapper[4752]: I0929 11:12:30.700085 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnhzk" event={"ID":"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff","Type":"ContainerStarted","Data":"910d11f3287e93a47225b77095b6286676a94890e174acbed685601ad3ee3697"} Sep 29 11:12:31 crc kubenswrapper[4752]: I0929 11:12:31.709632 4752 generic.go:334] "Generic (PLEG): container finished" podID="f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff" 
containerID="9a505861521e897b477d7d25ca5f422c484fed4caaa24b096c10f8b6d32d21be" exitCode=0 Sep 29 11:12:31 crc kubenswrapper[4752]: I0929 11:12:31.709691 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnhzk" event={"ID":"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff","Type":"ContainerDied","Data":"9a505861521e897b477d7d25ca5f422c484fed4caaa24b096c10f8b6d32d21be"} Sep 29 11:12:33 crc kubenswrapper[4752]: I0929 11:12:33.698539 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lcn2q" Sep 29 11:12:33 crc kubenswrapper[4752]: I0929 11:12:33.698962 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lcn2q" Sep 29 11:12:33 crc kubenswrapper[4752]: I0929 11:12:33.735736 4752 generic.go:334] "Generic (PLEG): container finished" podID="f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff" containerID="acfc9c29d6cdbeb49bc45eae22870d69fd62ac583783a806356d3ddc77263083" exitCode=0 Sep 29 11:12:33 crc kubenswrapper[4752]: I0929 11:12:33.735777 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnhzk" event={"ID":"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff","Type":"ContainerDied","Data":"acfc9c29d6cdbeb49bc45eae22870d69fd62ac583783a806356d3ddc77263083"} Sep 29 11:12:34 crc kubenswrapper[4752]: I0929 11:12:34.742636 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lcn2q" podUID="168dc9fa-2c32-4bf0-915d-79dd62d497f4" containerName="registry-server" probeResult="failure" output=< Sep 29 11:12:34 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Sep 29 11:12:34 crc kubenswrapper[4752]: > Sep 29 11:12:34 crc kubenswrapper[4752]: I0929 11:12:34.749886 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnhzk" 
event={"ID":"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff","Type":"ContainerStarted","Data":"cbedeb1da3849fa88003dd0483b942344ed993bef73f1ef6dd7f2e99514281f0"} Sep 29 11:12:34 crc kubenswrapper[4752]: I0929 11:12:34.770881 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mnhzk" podStartSLOduration=3.300295882 podStartE2EDuration="5.770863575s" podCreationTimestamp="2025-09-29 11:12:29 +0000 UTC" firstStartedPulling="2025-09-29 11:12:31.711204942 +0000 UTC m=+1692.500346609" lastFinishedPulling="2025-09-29 11:12:34.181772635 +0000 UTC m=+1694.970914302" observedRunningTime="2025-09-29 11:12:34.76761898 +0000 UTC m=+1695.556760647" watchObservedRunningTime="2025-09-29 11:12:34.770863575 +0000 UTC m=+1695.560005242" Sep 29 11:12:40 crc kubenswrapper[4752]: I0929 11:12:40.096687 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mnhzk" Sep 29 11:12:40 crc kubenswrapper[4752]: I0929 11:12:40.096773 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mnhzk" Sep 29 11:12:40 crc kubenswrapper[4752]: I0929 11:12:40.144050 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mnhzk" Sep 29 11:12:40 crc kubenswrapper[4752]: I0929 11:12:40.842906 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mnhzk" Sep 29 11:12:43 crc kubenswrapper[4752]: I0929 11:12:43.767518 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mnhzk"] Sep 29 11:12:43 crc kubenswrapper[4752]: I0929 11:12:43.768991 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mnhzk" podUID="f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff" containerName="registry-server" 
containerID="cri-o://cbedeb1da3849fa88003dd0483b942344ed993bef73f1ef6dd7f2e99514281f0" gracePeriod=2 Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.030995 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:12:44 crc kubenswrapper[4752]: E0929 11:12:44.031748 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.220150 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mnhzk" Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.405512 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpmrj\" (UniqueName: \"kubernetes.io/projected/f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff-kube-api-access-vpmrj\") pod \"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff\" (UID: \"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff\") " Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.405574 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff-catalog-content\") pod \"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff\" (UID: \"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff\") " Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.406251 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff-utilities\") pod \"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff\" (UID: 
\"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff\") " Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.407220 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff-utilities" (OuterVolumeSpecName: "utilities") pod "f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff" (UID: "f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.411419 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff-kube-api-access-vpmrj" (OuterVolumeSpecName: "kube-api-access-vpmrj") pod "f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff" (UID: "f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff"). InnerVolumeSpecName "kube-api-access-vpmrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.449281 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff" (UID: "f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.507971 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.508011 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpmrj\" (UniqueName: \"kubernetes.io/projected/f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff-kube-api-access-vpmrj\") on node \"crc\" DevicePath \"\"" Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.508021 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.745859 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lcn2q" podUID="168dc9fa-2c32-4bf0-915d-79dd62d497f4" containerName="registry-server" probeResult="failure" output=< Sep 29 11:12:44 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Sep 29 11:12:44 crc kubenswrapper[4752]: > Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.831616 4752 generic.go:334] "Generic (PLEG): container finished" podID="f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff" containerID="cbedeb1da3849fa88003dd0483b942344ed993bef73f1ef6dd7f2e99514281f0" exitCode=0 Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.831665 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnhzk" event={"ID":"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff","Type":"ContainerDied","Data":"cbedeb1da3849fa88003dd0483b942344ed993bef73f1ef6dd7f2e99514281f0"} Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.831693 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-mnhzk" event={"ID":"f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff","Type":"ContainerDied","Data":"910d11f3287e93a47225b77095b6286676a94890e174acbed685601ad3ee3697"} Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.831710 4752 scope.go:117] "RemoveContainer" containerID="cbedeb1da3849fa88003dd0483b942344ed993bef73f1ef6dd7f2e99514281f0" Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.831865 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mnhzk" Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.859041 4752 scope.go:117] "RemoveContainer" containerID="acfc9c29d6cdbeb49bc45eae22870d69fd62ac583783a806356d3ddc77263083" Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.870776 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mnhzk"] Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.877440 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mnhzk"] Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.884815 4752 scope.go:117] "RemoveContainer" containerID="9a505861521e897b477d7d25ca5f422c484fed4caaa24b096c10f8b6d32d21be" Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.919416 4752 scope.go:117] "RemoveContainer" containerID="cbedeb1da3849fa88003dd0483b942344ed993bef73f1ef6dd7f2e99514281f0" Sep 29 11:12:44 crc kubenswrapper[4752]: E0929 11:12:44.919974 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbedeb1da3849fa88003dd0483b942344ed993bef73f1ef6dd7f2e99514281f0\": container with ID starting with cbedeb1da3849fa88003dd0483b942344ed993bef73f1ef6dd7f2e99514281f0 not found: ID does not exist" containerID="cbedeb1da3849fa88003dd0483b942344ed993bef73f1ef6dd7f2e99514281f0" Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.920009 4752 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbedeb1da3849fa88003dd0483b942344ed993bef73f1ef6dd7f2e99514281f0"} err="failed to get container status \"cbedeb1da3849fa88003dd0483b942344ed993bef73f1ef6dd7f2e99514281f0\": rpc error: code = NotFound desc = could not find container \"cbedeb1da3849fa88003dd0483b942344ed993bef73f1ef6dd7f2e99514281f0\": container with ID starting with cbedeb1da3849fa88003dd0483b942344ed993bef73f1ef6dd7f2e99514281f0 not found: ID does not exist" Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.920032 4752 scope.go:117] "RemoveContainer" containerID="acfc9c29d6cdbeb49bc45eae22870d69fd62ac583783a806356d3ddc77263083" Sep 29 11:12:44 crc kubenswrapper[4752]: E0929 11:12:44.920445 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acfc9c29d6cdbeb49bc45eae22870d69fd62ac583783a806356d3ddc77263083\": container with ID starting with acfc9c29d6cdbeb49bc45eae22870d69fd62ac583783a806356d3ddc77263083 not found: ID does not exist" containerID="acfc9c29d6cdbeb49bc45eae22870d69fd62ac583783a806356d3ddc77263083" Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.920476 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acfc9c29d6cdbeb49bc45eae22870d69fd62ac583783a806356d3ddc77263083"} err="failed to get container status \"acfc9c29d6cdbeb49bc45eae22870d69fd62ac583783a806356d3ddc77263083\": rpc error: code = NotFound desc = could not find container \"acfc9c29d6cdbeb49bc45eae22870d69fd62ac583783a806356d3ddc77263083\": container with ID starting with acfc9c29d6cdbeb49bc45eae22870d69fd62ac583783a806356d3ddc77263083 not found: ID does not exist" Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.920494 4752 scope.go:117] "RemoveContainer" containerID="9a505861521e897b477d7d25ca5f422c484fed4caaa24b096c10f8b6d32d21be" Sep 29 11:12:44 crc kubenswrapper[4752]: E0929 
11:12:44.920855 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a505861521e897b477d7d25ca5f422c484fed4caaa24b096c10f8b6d32d21be\": container with ID starting with 9a505861521e897b477d7d25ca5f422c484fed4caaa24b096c10f8b6d32d21be not found: ID does not exist" containerID="9a505861521e897b477d7d25ca5f422c484fed4caaa24b096c10f8b6d32d21be" Sep 29 11:12:44 crc kubenswrapper[4752]: I0929 11:12:44.920882 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a505861521e897b477d7d25ca5f422c484fed4caaa24b096c10f8b6d32d21be"} err="failed to get container status \"9a505861521e897b477d7d25ca5f422c484fed4caaa24b096c10f8b6d32d21be\": rpc error: code = NotFound desc = could not find container \"9a505861521e897b477d7d25ca5f422c484fed4caaa24b096c10f8b6d32d21be\": container with ID starting with 9a505861521e897b477d7d25ca5f422c484fed4caaa24b096c10f8b6d32d21be not found: ID does not exist" Sep 29 11:12:46 crc kubenswrapper[4752]: I0929 11:12:46.042184 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff" path="/var/lib/kubelet/pods/f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff/volumes" Sep 29 11:12:54 crc kubenswrapper[4752]: I0929 11:12:54.744285 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lcn2q" podUID="168dc9fa-2c32-4bf0-915d-79dd62d497f4" containerName="registry-server" probeResult="failure" output=< Sep 29 11:12:54 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Sep 29 11:12:54 crc kubenswrapper[4752]: > Sep 29 11:12:55 crc kubenswrapper[4752]: I0929 11:12:55.032479 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:12:55 crc kubenswrapper[4752]: E0929 11:12:55.032799 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:13:04 crc kubenswrapper[4752]: I0929 11:13:04.742596 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lcn2q" podUID="168dc9fa-2c32-4bf0-915d-79dd62d497f4" containerName="registry-server" probeResult="failure" output=< Sep 29 11:13:04 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Sep 29 11:13:04 crc kubenswrapper[4752]: > Sep 29 11:13:10 crc kubenswrapper[4752]: I0929 11:13:10.040327 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:13:10 crc kubenswrapper[4752]: E0929 11:13:10.041658 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:13:13 crc kubenswrapper[4752]: I0929 11:13:13.742741 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lcn2q" Sep 29 11:13:13 crc kubenswrapper[4752]: I0929 11:13:13.796746 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lcn2q" Sep 29 11:13:16 crc kubenswrapper[4752]: I0929 11:13:16.073439 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-db-create-8mp5x"] Sep 
29 11:13:16 crc kubenswrapper[4752]: I0929 11:13:16.084215 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-db-create-8mp5x"] Sep 29 11:13:17 crc kubenswrapper[4752]: I0929 11:13:17.362055 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lcn2q"] Sep 29 11:13:17 crc kubenswrapper[4752]: I0929 11:13:17.362286 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lcn2q" podUID="168dc9fa-2c32-4bf0-915d-79dd62d497f4" containerName="registry-server" containerID="cri-o://e3924a0772f610357f3e56b0d36c3c26551c38e8002fbc1fb936cabe62d5966f" gracePeriod=2 Sep 29 11:13:17 crc kubenswrapper[4752]: I0929 11:13:17.789595 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lcn2q" Sep 29 11:13:17 crc kubenswrapper[4752]: I0929 11:13:17.889443 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168dc9fa-2c32-4bf0-915d-79dd62d497f4-catalog-content\") pod \"168dc9fa-2c32-4bf0-915d-79dd62d497f4\" (UID: \"168dc9fa-2c32-4bf0-915d-79dd62d497f4\") " Sep 29 11:13:17 crc kubenswrapper[4752]: I0929 11:13:17.889565 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n855v\" (UniqueName: \"kubernetes.io/projected/168dc9fa-2c32-4bf0-915d-79dd62d497f4-kube-api-access-n855v\") pod \"168dc9fa-2c32-4bf0-915d-79dd62d497f4\" (UID: \"168dc9fa-2c32-4bf0-915d-79dd62d497f4\") " Sep 29 11:13:17 crc kubenswrapper[4752]: I0929 11:13:17.889704 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168dc9fa-2c32-4bf0-915d-79dd62d497f4-utilities\") pod \"168dc9fa-2c32-4bf0-915d-79dd62d497f4\" (UID: \"168dc9fa-2c32-4bf0-915d-79dd62d497f4\") " Sep 29 11:13:17 crc 
kubenswrapper[4752]: I0929 11:13:17.890443 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/168dc9fa-2c32-4bf0-915d-79dd62d497f4-utilities" (OuterVolumeSpecName: "utilities") pod "168dc9fa-2c32-4bf0-915d-79dd62d497f4" (UID: "168dc9fa-2c32-4bf0-915d-79dd62d497f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:13:17 crc kubenswrapper[4752]: I0929 11:13:17.896062 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/168dc9fa-2c32-4bf0-915d-79dd62d497f4-kube-api-access-n855v" (OuterVolumeSpecName: "kube-api-access-n855v") pod "168dc9fa-2c32-4bf0-915d-79dd62d497f4" (UID: "168dc9fa-2c32-4bf0-915d-79dd62d497f4"). InnerVolumeSpecName "kube-api-access-n855v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:13:17 crc kubenswrapper[4752]: I0929 11:13:17.968991 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/168dc9fa-2c32-4bf0-915d-79dd62d497f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "168dc9fa-2c32-4bf0-915d-79dd62d497f4" (UID: "168dc9fa-2c32-4bf0-915d-79dd62d497f4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:13:17 crc kubenswrapper[4752]: I0929 11:13:17.991303 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n855v\" (UniqueName: \"kubernetes.io/projected/168dc9fa-2c32-4bf0-915d-79dd62d497f4-kube-api-access-n855v\") on node \"crc\" DevicePath \"\"" Sep 29 11:13:17 crc kubenswrapper[4752]: I0929 11:13:17.991381 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168dc9fa-2c32-4bf0-915d-79dd62d497f4-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 11:13:17 crc kubenswrapper[4752]: I0929 11:13:17.991394 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168dc9fa-2c32-4bf0-915d-79dd62d497f4-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 11:13:18 crc kubenswrapper[4752]: I0929 11:13:18.041650 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e25c334c-c7a9-46d2-b94b-9956db4aac80" path="/var/lib/kubelet/pods/e25c334c-c7a9-46d2-b94b-9956db4aac80/volumes" Sep 29 11:13:18 crc kubenswrapper[4752]: I0929 11:13:18.152244 4752 generic.go:334] "Generic (PLEG): container finished" podID="168dc9fa-2c32-4bf0-915d-79dd62d497f4" containerID="e3924a0772f610357f3e56b0d36c3c26551c38e8002fbc1fb936cabe62d5966f" exitCode=0 Sep 29 11:13:18 crc kubenswrapper[4752]: I0929 11:13:18.152480 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lcn2q" Sep 29 11:13:18 crc kubenswrapper[4752]: I0929 11:13:18.152761 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcn2q" event={"ID":"168dc9fa-2c32-4bf0-915d-79dd62d497f4","Type":"ContainerDied","Data":"e3924a0772f610357f3e56b0d36c3c26551c38e8002fbc1fb936cabe62d5966f"} Sep 29 11:13:18 crc kubenswrapper[4752]: I0929 11:13:18.153101 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcn2q" event={"ID":"168dc9fa-2c32-4bf0-915d-79dd62d497f4","Type":"ContainerDied","Data":"34c7e5a6393c76cfd94c08075486cd00c2aa7237234b62ff7113619d28804dc2"} Sep 29 11:13:18 crc kubenswrapper[4752]: I0929 11:13:18.153148 4752 scope.go:117] "RemoveContainer" containerID="e3924a0772f610357f3e56b0d36c3c26551c38e8002fbc1fb936cabe62d5966f" Sep 29 11:13:18 crc kubenswrapper[4752]: I0929 11:13:18.182140 4752 scope.go:117] "RemoveContainer" containerID="5db5a1212417cb4c7eb47db568e6a4778f7011c3e80caacb5c940f39c07d6dd3" Sep 29 11:13:18 crc kubenswrapper[4752]: I0929 11:13:18.185621 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lcn2q"] Sep 29 11:13:18 crc kubenswrapper[4752]: I0929 11:13:18.196431 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lcn2q"] Sep 29 11:13:18 crc kubenswrapper[4752]: I0929 11:13:18.211929 4752 scope.go:117] "RemoveContainer" containerID="1efe63563b053b4cc5f05d803d8d10a924f2e25999512d1913ea41aa7b845c38" Sep 29 11:13:18 crc kubenswrapper[4752]: I0929 11:13:18.240548 4752 scope.go:117] "RemoveContainer" containerID="e3924a0772f610357f3e56b0d36c3c26551c38e8002fbc1fb936cabe62d5966f" Sep 29 11:13:18 crc kubenswrapper[4752]: E0929 11:13:18.241382 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e3924a0772f610357f3e56b0d36c3c26551c38e8002fbc1fb936cabe62d5966f\": container with ID starting with e3924a0772f610357f3e56b0d36c3c26551c38e8002fbc1fb936cabe62d5966f not found: ID does not exist" containerID="e3924a0772f610357f3e56b0d36c3c26551c38e8002fbc1fb936cabe62d5966f" Sep 29 11:13:18 crc kubenswrapper[4752]: I0929 11:13:18.241451 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3924a0772f610357f3e56b0d36c3c26551c38e8002fbc1fb936cabe62d5966f"} err="failed to get container status \"e3924a0772f610357f3e56b0d36c3c26551c38e8002fbc1fb936cabe62d5966f\": rpc error: code = NotFound desc = could not find container \"e3924a0772f610357f3e56b0d36c3c26551c38e8002fbc1fb936cabe62d5966f\": container with ID starting with e3924a0772f610357f3e56b0d36c3c26551c38e8002fbc1fb936cabe62d5966f not found: ID does not exist" Sep 29 11:13:18 crc kubenswrapper[4752]: I0929 11:13:18.241481 4752 scope.go:117] "RemoveContainer" containerID="5db5a1212417cb4c7eb47db568e6a4778f7011c3e80caacb5c940f39c07d6dd3" Sep 29 11:13:18 crc kubenswrapper[4752]: E0929 11:13:18.242099 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5db5a1212417cb4c7eb47db568e6a4778f7011c3e80caacb5c940f39c07d6dd3\": container with ID starting with 5db5a1212417cb4c7eb47db568e6a4778f7011c3e80caacb5c940f39c07d6dd3 not found: ID does not exist" containerID="5db5a1212417cb4c7eb47db568e6a4778f7011c3e80caacb5c940f39c07d6dd3" Sep 29 11:13:18 crc kubenswrapper[4752]: I0929 11:13:18.242147 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5db5a1212417cb4c7eb47db568e6a4778f7011c3e80caacb5c940f39c07d6dd3"} err="failed to get container status \"5db5a1212417cb4c7eb47db568e6a4778f7011c3e80caacb5c940f39c07d6dd3\": rpc error: code = NotFound desc = could not find container \"5db5a1212417cb4c7eb47db568e6a4778f7011c3e80caacb5c940f39c07d6dd3\": container with ID 
starting with 5db5a1212417cb4c7eb47db568e6a4778f7011c3e80caacb5c940f39c07d6dd3 not found: ID does not exist" Sep 29 11:13:18 crc kubenswrapper[4752]: I0929 11:13:18.242166 4752 scope.go:117] "RemoveContainer" containerID="1efe63563b053b4cc5f05d803d8d10a924f2e25999512d1913ea41aa7b845c38" Sep 29 11:13:18 crc kubenswrapper[4752]: E0929 11:13:18.242788 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1efe63563b053b4cc5f05d803d8d10a924f2e25999512d1913ea41aa7b845c38\": container with ID starting with 1efe63563b053b4cc5f05d803d8d10a924f2e25999512d1913ea41aa7b845c38 not found: ID does not exist" containerID="1efe63563b053b4cc5f05d803d8d10a924f2e25999512d1913ea41aa7b845c38" Sep 29 11:13:18 crc kubenswrapper[4752]: I0929 11:13:18.242838 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1efe63563b053b4cc5f05d803d8d10a924f2e25999512d1913ea41aa7b845c38"} err="failed to get container status \"1efe63563b053b4cc5f05d803d8d10a924f2e25999512d1913ea41aa7b845c38\": rpc error: code = NotFound desc = could not find container \"1efe63563b053b4cc5f05d803d8d10a924f2e25999512d1913ea41aa7b845c38\": container with ID starting with 1efe63563b053b4cc5f05d803d8d10a924f2e25999512d1913ea41aa7b845c38 not found: ID does not exist" Sep 29 11:13:20 crc kubenswrapper[4752]: I0929 11:13:20.045125 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="168dc9fa-2c32-4bf0-915d-79dd62d497f4" path="/var/lib/kubelet/pods/168dc9fa-2c32-4bf0-915d-79dd62d497f4/volumes" Sep 29 11:13:21 crc kubenswrapper[4752]: I0929 11:13:21.078151 4752 scope.go:117] "RemoveContainer" containerID="6e951f81f5c5a9b7c5ffc69ed43b0a83d60384b01f23a028a73575b9edb36eea" Sep 29 11:13:25 crc kubenswrapper[4752]: I0929 11:13:25.031057 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:13:25 crc kubenswrapper[4752]: 
E0929 11:13:25.031768 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:13:26 crc kubenswrapper[4752]: I0929 11:13:26.026188 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-238e-account-create-vpltk"] Sep 29 11:13:26 crc kubenswrapper[4752]: I0929 11:13:26.042984 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-238e-account-create-vpltk"] Sep 29 11:13:28 crc kubenswrapper[4752]: I0929 11:13:28.054093 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a39b4ca-d533-4620-8e4e-827f7e2dc8de" path="/var/lib/kubelet/pods/9a39b4ca-d533-4620-8e4e-827f7e2dc8de/volumes" Sep 29 11:13:38 crc kubenswrapper[4752]: I0929 11:13:38.031363 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:13:38 crc kubenswrapper[4752]: E0929 11:13:38.033331 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:13:49 crc kubenswrapper[4752]: I0929 11:13:49.030898 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:13:49 crc kubenswrapper[4752]: E0929 11:13:49.031662 4752 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:13:56 crc kubenswrapper[4752]: I0929 11:13:56.072478 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-6p29p"] Sep 29 11:13:56 crc kubenswrapper[4752]: I0929 11:13:56.079677 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-6p29p"] Sep 29 11:13:58 crc kubenswrapper[4752]: I0929 11:13:58.041371 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cce556e-44f8-46c9-ba1d-b609c7e6468d" path="/var/lib/kubelet/pods/1cce556e-44f8-46c9-ba1d-b609c7e6468d/volumes" Sep 29 11:14:00 crc kubenswrapper[4752]: I0929 11:14:00.038839 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:14:00 crc kubenswrapper[4752]: I0929 11:14:00.516160 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerStarted","Data":"910e9715ff191c5fe48e666106182c9ab8ee872d75d2fda3908f263b58dd32be"} Sep 29 11:14:20 crc kubenswrapper[4752]: I0929 11:14:20.049628 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-fr8mq"] Sep 29 11:14:20 crc kubenswrapper[4752]: I0929 11:14:20.056177 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-fr8mq"] Sep 29 11:14:21 crc kubenswrapper[4752]: I0929 11:14:21.162607 4752 scope.go:117] "RemoveContainer" 
containerID="4d3149214022ac33d5cd5dcaa9685e20715f7a26cc5a0de14b7149211b77adde" Sep 29 11:14:21 crc kubenswrapper[4752]: I0929 11:14:21.197604 4752 scope.go:117] "RemoveContainer" containerID="53d9cefff49de70973024a39cc6726dd9a2b5fd75f345c1ae68c6f8604cbae54" Sep 29 11:14:22 crc kubenswrapper[4752]: I0929 11:14:22.047617 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="205ab4e1-725f-4b7e-8c70-b809d2390460" path="/var/lib/kubelet/pods/205ab4e1-725f-4b7e-8c70-b809d2390460/volumes" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.155318 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319075-jfm9n"] Sep 29 11:15:00 crc kubenswrapper[4752]: E0929 11:15:00.156142 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff" containerName="extract-content" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.156155 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff" containerName="extract-content" Sep 29 11:15:00 crc kubenswrapper[4752]: E0929 11:15:00.156171 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168dc9fa-2c32-4bf0-915d-79dd62d497f4" containerName="extract-content" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.156177 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="168dc9fa-2c32-4bf0-915d-79dd62d497f4" containerName="extract-content" Sep 29 11:15:00 crc kubenswrapper[4752]: E0929 11:15:00.156187 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff" containerName="registry-server" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.156193 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff" containerName="registry-server" Sep 29 11:15:00 crc kubenswrapper[4752]: E0929 11:15:00.156204 4752 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="168dc9fa-2c32-4bf0-915d-79dd62d497f4" containerName="extract-utilities" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.156212 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="168dc9fa-2c32-4bf0-915d-79dd62d497f4" containerName="extract-utilities" Sep 29 11:15:00 crc kubenswrapper[4752]: E0929 11:15:00.156228 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168dc9fa-2c32-4bf0-915d-79dd62d497f4" containerName="registry-server" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.156233 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="168dc9fa-2c32-4bf0-915d-79dd62d497f4" containerName="registry-server" Sep 29 11:15:00 crc kubenswrapper[4752]: E0929 11:15:00.156249 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff" containerName="extract-utilities" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.156255 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff" containerName="extract-utilities" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.156396 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ee24d0-9dd5-4fa4-8483-ecd0b3f0c4ff" containerName="registry-server" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.156411 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="168dc9fa-2c32-4bf0-915d-79dd62d497f4" containerName="registry-server" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.156975 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319075-jfm9n" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.161039 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.161305 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.167842 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319075-jfm9n"] Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.243175 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9brd\" (UniqueName: \"kubernetes.io/projected/5596b59b-2e43-4c11-8882-c59fbc456f63-kube-api-access-j9brd\") pod \"collect-profiles-29319075-jfm9n\" (UID: \"5596b59b-2e43-4c11-8882-c59fbc456f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319075-jfm9n" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.243361 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5596b59b-2e43-4c11-8882-c59fbc456f63-config-volume\") pod \"collect-profiles-29319075-jfm9n\" (UID: \"5596b59b-2e43-4c11-8882-c59fbc456f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319075-jfm9n" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.243428 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5596b59b-2e43-4c11-8882-c59fbc456f63-secret-volume\") pod \"collect-profiles-29319075-jfm9n\" (UID: \"5596b59b-2e43-4c11-8882-c59fbc456f63\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29319075-jfm9n" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.344713 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5596b59b-2e43-4c11-8882-c59fbc456f63-config-volume\") pod \"collect-profiles-29319075-jfm9n\" (UID: \"5596b59b-2e43-4c11-8882-c59fbc456f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319075-jfm9n" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.344816 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5596b59b-2e43-4c11-8882-c59fbc456f63-secret-volume\") pod \"collect-profiles-29319075-jfm9n\" (UID: \"5596b59b-2e43-4c11-8882-c59fbc456f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319075-jfm9n" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.344889 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9brd\" (UniqueName: \"kubernetes.io/projected/5596b59b-2e43-4c11-8882-c59fbc456f63-kube-api-access-j9brd\") pod \"collect-profiles-29319075-jfm9n\" (UID: \"5596b59b-2e43-4c11-8882-c59fbc456f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319075-jfm9n" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.345728 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5596b59b-2e43-4c11-8882-c59fbc456f63-config-volume\") pod \"collect-profiles-29319075-jfm9n\" (UID: \"5596b59b-2e43-4c11-8882-c59fbc456f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319075-jfm9n" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.353606 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5596b59b-2e43-4c11-8882-c59fbc456f63-secret-volume\") pod \"collect-profiles-29319075-jfm9n\" (UID: \"5596b59b-2e43-4c11-8882-c59fbc456f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319075-jfm9n" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.364570 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9brd\" (UniqueName: \"kubernetes.io/projected/5596b59b-2e43-4c11-8882-c59fbc456f63-kube-api-access-j9brd\") pod \"collect-profiles-29319075-jfm9n\" (UID: \"5596b59b-2e43-4c11-8882-c59fbc456f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319075-jfm9n" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.484947 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319075-jfm9n" Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.962647 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319075-jfm9n"] Sep 29 11:15:00 crc kubenswrapper[4752]: I0929 11:15:00.997077 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319075-jfm9n" event={"ID":"5596b59b-2e43-4c11-8882-c59fbc456f63","Type":"ContainerStarted","Data":"6a3d97c8040630355623afde4d6a52cfaeb71b7f8caa859993c7689f5bbab6ae"} Sep 29 11:15:02 crc kubenswrapper[4752]: I0929 11:15:02.005737 4752 generic.go:334] "Generic (PLEG): container finished" podID="5596b59b-2e43-4c11-8882-c59fbc456f63" containerID="b50bd2599671b8efe07ee18197662cc3de2ad869fc139eaae6bd35cab6d25e7d" exitCode=0 Sep 29 11:15:02 crc kubenswrapper[4752]: I0929 11:15:02.005865 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319075-jfm9n" 
event={"ID":"5596b59b-2e43-4c11-8882-c59fbc456f63","Type":"ContainerDied","Data":"b50bd2599671b8efe07ee18197662cc3de2ad869fc139eaae6bd35cab6d25e7d"} Sep 29 11:15:03 crc kubenswrapper[4752]: I0929 11:15:03.336723 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319075-jfm9n" Sep 29 11:15:03 crc kubenswrapper[4752]: I0929 11:15:03.397406 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5596b59b-2e43-4c11-8882-c59fbc456f63-secret-volume\") pod \"5596b59b-2e43-4c11-8882-c59fbc456f63\" (UID: \"5596b59b-2e43-4c11-8882-c59fbc456f63\") " Sep 29 11:15:03 crc kubenswrapper[4752]: I0929 11:15:03.397636 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5596b59b-2e43-4c11-8882-c59fbc456f63-config-volume\") pod \"5596b59b-2e43-4c11-8882-c59fbc456f63\" (UID: \"5596b59b-2e43-4c11-8882-c59fbc456f63\") " Sep 29 11:15:03 crc kubenswrapper[4752]: I0929 11:15:03.397685 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9brd\" (UniqueName: \"kubernetes.io/projected/5596b59b-2e43-4c11-8882-c59fbc456f63-kube-api-access-j9brd\") pod \"5596b59b-2e43-4c11-8882-c59fbc456f63\" (UID: \"5596b59b-2e43-4c11-8882-c59fbc456f63\") " Sep 29 11:15:03 crc kubenswrapper[4752]: I0929 11:15:03.399081 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5596b59b-2e43-4c11-8882-c59fbc456f63-config-volume" (OuterVolumeSpecName: "config-volume") pod "5596b59b-2e43-4c11-8882-c59fbc456f63" (UID: "5596b59b-2e43-4c11-8882-c59fbc456f63"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 11:15:03 crc kubenswrapper[4752]: I0929 11:15:03.404262 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5596b59b-2e43-4c11-8882-c59fbc456f63-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5596b59b-2e43-4c11-8882-c59fbc456f63" (UID: "5596b59b-2e43-4c11-8882-c59fbc456f63"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:15:03 crc kubenswrapper[4752]: I0929 11:15:03.404540 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5596b59b-2e43-4c11-8882-c59fbc456f63-kube-api-access-j9brd" (OuterVolumeSpecName: "kube-api-access-j9brd") pod "5596b59b-2e43-4c11-8882-c59fbc456f63" (UID: "5596b59b-2e43-4c11-8882-c59fbc456f63"). InnerVolumeSpecName "kube-api-access-j9brd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:15:03 crc kubenswrapper[4752]: I0929 11:15:03.499380 4752 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5596b59b-2e43-4c11-8882-c59fbc456f63-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:03 crc kubenswrapper[4752]: I0929 11:15:03.499429 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5596b59b-2e43-4c11-8882-c59fbc456f63-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:03 crc kubenswrapper[4752]: I0929 11:15:03.499462 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9brd\" (UniqueName: \"kubernetes.io/projected/5596b59b-2e43-4c11-8882-c59fbc456f63-kube-api-access-j9brd\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:04 crc kubenswrapper[4752]: I0929 11:15:04.020597 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319075-jfm9n" 
event={"ID":"5596b59b-2e43-4c11-8882-c59fbc456f63","Type":"ContainerDied","Data":"6a3d97c8040630355623afde4d6a52cfaeb71b7f8caa859993c7689f5bbab6ae"} Sep 29 11:15:04 crc kubenswrapper[4752]: I0929 11:15:04.020659 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a3d97c8040630355623afde4d6a52cfaeb71b7f8caa859993c7689f5bbab6ae" Sep 29 11:15:04 crc kubenswrapper[4752]: I0929 11:15:04.020658 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319075-jfm9n" Sep 29 11:15:10 crc kubenswrapper[4752]: I0929 11:15:10.288506 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr"] Sep 29 11:15:10 crc kubenswrapper[4752]: I0929 11:15:10.294848 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-qh5gr"] Sep 29 11:15:10 crc kubenswrapper[4752]: I0929 11:15:10.344455 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher3aef-account-delete-gb2fj"] Sep 29 11:15:10 crc kubenswrapper[4752]: E0929 11:15:10.344875 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5596b59b-2e43-4c11-8882-c59fbc456f63" containerName="collect-profiles" Sep 29 11:15:10 crc kubenswrapper[4752]: I0929 11:15:10.344893 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="5596b59b-2e43-4c11-8882-c59fbc456f63" containerName="collect-profiles" Sep 29 11:15:10 crc kubenswrapper[4752]: I0929 11:15:10.345124 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="5596b59b-2e43-4c11-8882-c59fbc456f63" containerName="collect-profiles" Sep 29 11:15:10 crc kubenswrapper[4752]: I0929 11:15:10.345855 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher3aef-account-delete-gb2fj" Sep 29 11:15:10 crc kubenswrapper[4752]: I0929 11:15:10.357570 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher3aef-account-delete-gb2fj"] Sep 29 11:15:10 crc kubenswrapper[4752]: I0929 11:15:10.376964 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:15:10 crc kubenswrapper[4752]: I0929 11:15:10.377312 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="8b00fc0a-2c62-480e-93e6-95ed8b1305c8" containerName="watcher-applier" containerID="cri-o://08fb7c9ca8887afc3730321ac0ce1b0e0e37d821b9e72f8a10de80286cba82ea" gracePeriod=30 Sep 29 11:15:10 crc kubenswrapper[4752]: I0929 11:15:10.412357 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tksl6\" (UniqueName: \"kubernetes.io/projected/290917fa-62e5-4763-8d82-a941c4e6d6a6-kube-api-access-tksl6\") pod \"watcher3aef-account-delete-gb2fj\" (UID: \"290917fa-62e5-4763-8d82-a941c4e6d6a6\") " pod="watcher-kuttl-default/watcher3aef-account-delete-gb2fj" Sep 29 11:15:10 crc kubenswrapper[4752]: I0929 11:15:10.451152 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:15:10 crc kubenswrapper[4752]: I0929 11:15:10.451654 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="1b0a1d48-69e7-486a-8163-1961ba8d3501" containerName="watcher-api" containerID="cri-o://54ac9ff5c2216d873da434228eb3eef8de401bc1ec5dbc7c9f11f451740d804e" gracePeriod=30 Sep 29 11:15:10 crc kubenswrapper[4752]: I0929 11:15:10.451461 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
podUID="1b0a1d48-69e7-486a-8163-1961ba8d3501" containerName="watcher-kuttl-api-log" containerID="cri-o://78468825fe549e376ecd19a9147b25b3c7226a1fad3845684a601b80708898ee" gracePeriod=30 Sep 29 11:15:10 crc kubenswrapper[4752]: I0929 11:15:10.515113 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tksl6\" (UniqueName: \"kubernetes.io/projected/290917fa-62e5-4763-8d82-a941c4e6d6a6-kube-api-access-tksl6\") pod \"watcher3aef-account-delete-gb2fj\" (UID: \"290917fa-62e5-4763-8d82-a941c4e6d6a6\") " pod="watcher-kuttl-default/watcher3aef-account-delete-gb2fj" Sep 29 11:15:10 crc kubenswrapper[4752]: I0929 11:15:10.516524 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:15:10 crc kubenswrapper[4752]: I0929 11:15:10.516784 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f39fa61e-8858-4d6e-80db-bc94fdccaec8" containerName="watcher-decision-engine" containerID="cri-o://a17245ec1c9efb32f26a19fe377a440a7fd3b84c08fbbdd6d68079700e9a78e0" gracePeriod=30 Sep 29 11:15:10 crc kubenswrapper[4752]: I0929 11:15:10.541972 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tksl6\" (UniqueName: \"kubernetes.io/projected/290917fa-62e5-4763-8d82-a941c4e6d6a6-kube-api-access-tksl6\") pod \"watcher3aef-account-delete-gb2fj\" (UID: \"290917fa-62e5-4763-8d82-a941c4e6d6a6\") " pod="watcher-kuttl-default/watcher3aef-account-delete-gb2fj" Sep 29 11:15:10 crc kubenswrapper[4752]: I0929 11:15:10.669121 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher3aef-account-delete-gb2fj" Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.093085 4752 generic.go:334] "Generic (PLEG): container finished" podID="1b0a1d48-69e7-486a-8163-1961ba8d3501" containerID="78468825fe549e376ecd19a9147b25b3c7226a1fad3845684a601b80708898ee" exitCode=143 Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.093623 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1b0a1d48-69e7-486a-8163-1961ba8d3501","Type":"ContainerDied","Data":"78468825fe549e376ecd19a9147b25b3c7226a1fad3845684a601b80708898ee"} Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.213675 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher3aef-account-delete-gb2fj"] Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.323971 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="1b0a1d48-69e7-486a-8163-1961ba8d3501" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.146:9322/\": read tcp 10.217.0.2:51352->10.217.0.146:9322: read: connection reset by peer" Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.324946 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="1b0a1d48-69e7-486a-8163-1961ba8d3501" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"https://10.217.0.146:9322/\": read tcp 10.217.0.2:51362->10.217.0.146:9322: read: connection reset by peer" Sep 29 11:15:11 crc kubenswrapper[4752]: E0929 11:15:11.383237 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="08fb7c9ca8887afc3730321ac0ce1b0e0e37d821b9e72f8a10de80286cba82ea" 
cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:15:11 crc kubenswrapper[4752]: E0929 11:15:11.385545 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="08fb7c9ca8887afc3730321ac0ce1b0e0e37d821b9e72f8a10de80286cba82ea" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:15:11 crc kubenswrapper[4752]: E0929 11:15:11.386719 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="08fb7c9ca8887afc3730321ac0ce1b0e0e37d821b9e72f8a10de80286cba82ea" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:15:11 crc kubenswrapper[4752]: E0929 11:15:11.386755 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="8b00fc0a-2c62-480e-93e6-95ed8b1305c8" containerName="watcher-applier" Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.773820 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.842200 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-custom-prometheus-ca\") pod \"1b0a1d48-69e7-486a-8163-1961ba8d3501\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.842264 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-internal-tls-certs\") pod \"1b0a1d48-69e7-486a-8163-1961ba8d3501\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.842311 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-combined-ca-bundle\") pod \"1b0a1d48-69e7-486a-8163-1961ba8d3501\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.842336 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhpv7\" (UniqueName: \"kubernetes.io/projected/1b0a1d48-69e7-486a-8163-1961ba8d3501-kube-api-access-vhpv7\") pod \"1b0a1d48-69e7-486a-8163-1961ba8d3501\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.842394 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-public-tls-certs\") pod \"1b0a1d48-69e7-486a-8163-1961ba8d3501\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.842423 4752 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-config-data\") pod \"1b0a1d48-69e7-486a-8163-1961ba8d3501\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.842472 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b0a1d48-69e7-486a-8163-1961ba8d3501-logs\") pod \"1b0a1d48-69e7-486a-8163-1961ba8d3501\" (UID: \"1b0a1d48-69e7-486a-8163-1961ba8d3501\") " Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.843698 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b0a1d48-69e7-486a-8163-1961ba8d3501-logs" (OuterVolumeSpecName: "logs") pod "1b0a1d48-69e7-486a-8163-1961ba8d3501" (UID: "1b0a1d48-69e7-486a-8163-1961ba8d3501"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.857117 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b0a1d48-69e7-486a-8163-1961ba8d3501-kube-api-access-vhpv7" (OuterVolumeSpecName: "kube-api-access-vhpv7") pod "1b0a1d48-69e7-486a-8163-1961ba8d3501" (UID: "1b0a1d48-69e7-486a-8163-1961ba8d3501"). InnerVolumeSpecName "kube-api-access-vhpv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.944875 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b0a1d48-69e7-486a-8163-1961ba8d3501-logs\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.944907 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhpv7\" (UniqueName: \"kubernetes.io/projected/1b0a1d48-69e7-486a-8163-1961ba8d3501-kube-api-access-vhpv7\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.946251 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-config-data" (OuterVolumeSpecName: "config-data") pod "1b0a1d48-69e7-486a-8163-1961ba8d3501" (UID: "1b0a1d48-69e7-486a-8163-1961ba8d3501"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.950455 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1b0a1d48-69e7-486a-8163-1961ba8d3501" (UID: "1b0a1d48-69e7-486a-8163-1961ba8d3501"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.950606 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b0a1d48-69e7-486a-8163-1961ba8d3501" (UID: "1b0a1d48-69e7-486a-8163-1961ba8d3501"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.957719 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "1b0a1d48-69e7-486a-8163-1961ba8d3501" (UID: "1b0a1d48-69e7-486a-8163-1961ba8d3501"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:15:11 crc kubenswrapper[4752]: I0929 11:15:11.969395 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1b0a1d48-69e7-486a-8163-1961ba8d3501" (UID: "1b0a1d48-69e7-486a-8163-1961ba8d3501"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.044367 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10e50680-30b0-43b2-a586-b9bb5101ed94" path="/var/lib/kubelet/pods/10e50680-30b0-43b2-a586-b9bb5101ed94/volumes" Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.047052 4752 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.047099 4752 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.047120 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.047135 4752 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.047148 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b0a1d48-69e7-486a-8163-1961ba8d3501-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.102474 4752 generic.go:334] "Generic (PLEG): container finished" podID="290917fa-62e5-4763-8d82-a941c4e6d6a6" containerID="49b5e72ade9fa195e2885a3effa0b343cde6622f15828e925bd8c0c020891fa1" exitCode=0 Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.102590 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher3aef-account-delete-gb2fj" event={"ID":"290917fa-62e5-4763-8d82-a941c4e6d6a6","Type":"ContainerDied","Data":"49b5e72ade9fa195e2885a3effa0b343cde6622f15828e925bd8c0c020891fa1"} Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.102620 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher3aef-account-delete-gb2fj" event={"ID":"290917fa-62e5-4763-8d82-a941c4e6d6a6","Type":"ContainerStarted","Data":"9107c2ba236a859890e976ab94c82f9caa7782eb1f4c3feaadb30aa8fad1cecc"} Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.110728 4752 generic.go:334] "Generic (PLEG): container finished" podID="1b0a1d48-69e7-486a-8163-1961ba8d3501" containerID="54ac9ff5c2216d873da434228eb3eef8de401bc1ec5dbc7c9f11f451740d804e" exitCode=0 Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.110778 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
event={"ID":"1b0a1d48-69e7-486a-8163-1961ba8d3501","Type":"ContainerDied","Data":"54ac9ff5c2216d873da434228eb3eef8de401bc1ec5dbc7c9f11f451740d804e"} Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.110823 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.110838 4752 scope.go:117] "RemoveContainer" containerID="54ac9ff5c2216d873da434228eb3eef8de401bc1ec5dbc7c9f11f451740d804e" Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.110824 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1b0a1d48-69e7-486a-8163-1961ba8d3501","Type":"ContainerDied","Data":"dde34122c9c68578193254f5e6af0182cf69d39bdce0e51a928e7c81c66a009c"} Sep 29 11:15:12 crc kubenswrapper[4752]: E0929 11:15:12.153904 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b0a1d48_69e7_486a_8163_1961ba8d3501.slice\": RecentStats: unable to find data in memory cache]" Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.186913 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.194642 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.210533 4752 scope.go:117] "RemoveContainer" containerID="78468825fe549e376ecd19a9147b25b3c7226a1fad3845684a601b80708898ee" Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.247617 4752 scope.go:117] "RemoveContainer" containerID="54ac9ff5c2216d873da434228eb3eef8de401bc1ec5dbc7c9f11f451740d804e" Sep 29 11:15:12 crc kubenswrapper[4752]: E0929 11:15:12.248144 4752 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"54ac9ff5c2216d873da434228eb3eef8de401bc1ec5dbc7c9f11f451740d804e\": container with ID starting with 54ac9ff5c2216d873da434228eb3eef8de401bc1ec5dbc7c9f11f451740d804e not found: ID does not exist" containerID="54ac9ff5c2216d873da434228eb3eef8de401bc1ec5dbc7c9f11f451740d804e"
Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.248174 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54ac9ff5c2216d873da434228eb3eef8de401bc1ec5dbc7c9f11f451740d804e"} err="failed to get container status \"54ac9ff5c2216d873da434228eb3eef8de401bc1ec5dbc7c9f11f451740d804e\": rpc error: code = NotFound desc = could not find container \"54ac9ff5c2216d873da434228eb3eef8de401bc1ec5dbc7c9f11f451740d804e\": container with ID starting with 54ac9ff5c2216d873da434228eb3eef8de401bc1ec5dbc7c9f11f451740d804e not found: ID does not exist"
Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.248197 4752 scope.go:117] "RemoveContainer" containerID="78468825fe549e376ecd19a9147b25b3c7226a1fad3845684a601b80708898ee"
Sep 29 11:15:12 crc kubenswrapper[4752]: E0929 11:15:12.248426 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78468825fe549e376ecd19a9147b25b3c7226a1fad3845684a601b80708898ee\": container with ID starting with 78468825fe549e376ecd19a9147b25b3c7226a1fad3845684a601b80708898ee not found: ID does not exist" containerID="78468825fe549e376ecd19a9147b25b3c7226a1fad3845684a601b80708898ee"
Sep 29 11:15:12 crc kubenswrapper[4752]: I0929 11:15:12.248446 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78468825fe549e376ecd19a9147b25b3c7226a1fad3845684a601b80708898ee"} err="failed to get container status \"78468825fe549e376ecd19a9147b25b3c7226a1fad3845684a601b80708898ee\": rpc error: code = NotFound desc = could not find container \"78468825fe549e376ecd19a9147b25b3c7226a1fad3845684a601b80708898ee\": container with ID starting with 78468825fe549e376ecd19a9147b25b3c7226a1fad3845684a601b80708898ee not found: ID does not exist"
Sep 29 11:15:13 crc kubenswrapper[4752]: I0929 11:15:13.102993 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Sep 29 11:15:13 crc kubenswrapper[4752]: I0929 11:15:13.103563 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="8b09d00c-78f2-4399-bc78-3c901f3470ad" containerName="ceilometer-central-agent" containerID="cri-o://7027072bd227485b6c8846e90b7a47acbf71310ea093d71a484c0d57bd584f85" gracePeriod=30
Sep 29 11:15:13 crc kubenswrapper[4752]: I0929 11:15:13.103659 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="8b09d00c-78f2-4399-bc78-3c901f3470ad" containerName="proxy-httpd" containerID="cri-o://f185d6ba4609c787ea95f5509f05e77b699306647b1a4f71d4c17539ca5826c6" gracePeriod=30
Sep 29 11:15:13 crc kubenswrapper[4752]: I0929 11:15:13.103705 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="8b09d00c-78f2-4399-bc78-3c901f3470ad" containerName="sg-core" containerID="cri-o://3d0632330e2d8ac690c35022aa66b02ac23ae110fccd155839efcc35e041325d" gracePeriod=30
Sep 29 11:15:13 crc kubenswrapper[4752]: I0929 11:15:13.103746 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="8b09d00c-78f2-4399-bc78-3c901f3470ad" containerName="ceilometer-notification-agent" containerID="cri-o://a59217981fcb9b639e460e9b3c77de8fb9adcca23087040a03f8f57cd23a1b8b" gracePeriod=30
Sep 29 11:15:13 crc kubenswrapper[4752]: I0929 11:15:13.429853 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher3aef-account-delete-gb2fj"
Sep 29 11:15:13 crc kubenswrapper[4752]: I0929 11:15:13.467650 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tksl6\" (UniqueName: \"kubernetes.io/projected/290917fa-62e5-4763-8d82-a941c4e6d6a6-kube-api-access-tksl6\") pod \"290917fa-62e5-4763-8d82-a941c4e6d6a6\" (UID: \"290917fa-62e5-4763-8d82-a941c4e6d6a6\") "
Sep 29 11:15:13 crc kubenswrapper[4752]: I0929 11:15:13.475654 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/290917fa-62e5-4763-8d82-a941c4e6d6a6-kube-api-access-tksl6" (OuterVolumeSpecName: "kube-api-access-tksl6") pod "290917fa-62e5-4763-8d82-a941c4e6d6a6" (UID: "290917fa-62e5-4763-8d82-a941c4e6d6a6"). InnerVolumeSpecName "kube-api-access-tksl6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 11:15:13 crc kubenswrapper[4752]: I0929 11:15:13.569296 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tksl6\" (UniqueName: \"kubernetes.io/projected/290917fa-62e5-4763-8d82-a941c4e6d6a6-kube-api-access-tksl6\") on node \"crc\" DevicePath \"\""
Sep 29 11:15:14 crc kubenswrapper[4752]: I0929 11:15:14.041558 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b0a1d48-69e7-486a-8163-1961ba8d3501" path="/var/lib/kubelet/pods/1b0a1d48-69e7-486a-8163-1961ba8d3501/volumes"
Sep 29 11:15:14 crc kubenswrapper[4752]: I0929 11:15:14.136753 4752 generic.go:334] "Generic (PLEG): container finished" podID="8b09d00c-78f2-4399-bc78-3c901f3470ad" containerID="f185d6ba4609c787ea95f5509f05e77b699306647b1a4f71d4c17539ca5826c6" exitCode=0
Sep 29 11:15:14 crc kubenswrapper[4752]: I0929 11:15:14.136798 4752 generic.go:334] "Generic (PLEG): container finished" podID="8b09d00c-78f2-4399-bc78-3c901f3470ad" containerID="3d0632330e2d8ac690c35022aa66b02ac23ae110fccd155839efcc35e041325d" exitCode=2
Sep 29 11:15:14 crc kubenswrapper[4752]: I0929 11:15:14.136820 4752 generic.go:334] "Generic (PLEG): container finished" podID="8b09d00c-78f2-4399-bc78-3c901f3470ad" containerID="7027072bd227485b6c8846e90b7a47acbf71310ea093d71a484c0d57bd584f85" exitCode=0
Sep 29 11:15:14 crc kubenswrapper[4752]: I0929 11:15:14.136834 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8b09d00c-78f2-4399-bc78-3c901f3470ad","Type":"ContainerDied","Data":"f185d6ba4609c787ea95f5509f05e77b699306647b1a4f71d4c17539ca5826c6"}
Sep 29 11:15:14 crc kubenswrapper[4752]: I0929 11:15:14.136875 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8b09d00c-78f2-4399-bc78-3c901f3470ad","Type":"ContainerDied","Data":"3d0632330e2d8ac690c35022aa66b02ac23ae110fccd155839efcc35e041325d"}
Sep 29 11:15:14 crc kubenswrapper[4752]: I0929 11:15:14.137249 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8b09d00c-78f2-4399-bc78-3c901f3470ad","Type":"ContainerDied","Data":"7027072bd227485b6c8846e90b7a47acbf71310ea093d71a484c0d57bd584f85"}
Sep 29 11:15:14 crc kubenswrapper[4752]: I0929 11:15:14.138303 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher3aef-account-delete-gb2fj" event={"ID":"290917fa-62e5-4763-8d82-a941c4e6d6a6","Type":"ContainerDied","Data":"9107c2ba236a859890e976ab94c82f9caa7782eb1f4c3feaadb30aa8fad1cecc"}
Sep 29 11:15:14 crc kubenswrapper[4752]: I0929 11:15:14.138334 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9107c2ba236a859890e976ab94c82f9caa7782eb1f4c3feaadb30aa8fad1cecc"
Sep 29 11:15:14 crc kubenswrapper[4752]: I0929 11:15:14.138356 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher3aef-account-delete-gb2fj"
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.379918 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-9xkb5"]
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.388192 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-9xkb5"]
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.401867 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher3aef-account-delete-gb2fj"]
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.408837 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-3aef-account-create-c6ng4"]
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.416660 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher3aef-account-delete-gb2fj"]
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.425521 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-3aef-account-create-c6ng4"]
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.542837 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-rm9qv"]
Sep 29 11:15:15 crc kubenswrapper[4752]: E0929 11:15:15.543330 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290917fa-62e5-4763-8d82-a941c4e6d6a6" containerName="mariadb-account-delete"
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.543366 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="290917fa-62e5-4763-8d82-a941c4e6d6a6" containerName="mariadb-account-delete"
Sep 29 11:15:15 crc kubenswrapper[4752]: E0929 11:15:15.543383 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b0a1d48-69e7-486a-8163-1961ba8d3501" containerName="watcher-api"
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.543390 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b0a1d48-69e7-486a-8163-1961ba8d3501" containerName="watcher-api"
Sep 29 11:15:15 crc kubenswrapper[4752]: E0929 11:15:15.543401 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b0a1d48-69e7-486a-8163-1961ba8d3501" containerName="watcher-kuttl-api-log"
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.543407 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b0a1d48-69e7-486a-8163-1961ba8d3501" containerName="watcher-kuttl-api-log"
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.543611 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b0a1d48-69e7-486a-8163-1961ba8d3501" containerName="watcher-api"
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.543643 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="290917fa-62e5-4763-8d82-a941c4e6d6a6" containerName="mariadb-account-delete"
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.543655 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b0a1d48-69e7-486a-8163-1961ba8d3501" containerName="watcher-kuttl-api-log"
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.544543 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-rm9qv"
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.547333 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-rm9qv"]
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.602733 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2wph\" (UniqueName: \"kubernetes.io/projected/8d9cb262-2d02-4933-a57b-05a013784977-kube-api-access-p2wph\") pod \"watcher-db-create-rm9qv\" (UID: \"8d9cb262-2d02-4933-a57b-05a013784977\") " pod="watcher-kuttl-default/watcher-db-create-rm9qv"
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.668839 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.704163 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-combined-ca-bundle\") pod \"8b00fc0a-2c62-480e-93e6-95ed8b1305c8\" (UID: \"8b00fc0a-2c62-480e-93e6-95ed8b1305c8\") "
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.704244 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-config-data\") pod \"8b00fc0a-2c62-480e-93e6-95ed8b1305c8\" (UID: \"8b00fc0a-2c62-480e-93e6-95ed8b1305c8\") "
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.704276 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-logs\") pod \"8b00fc0a-2c62-480e-93e6-95ed8b1305c8\" (UID: \"8b00fc0a-2c62-480e-93e6-95ed8b1305c8\") "
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.704298 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqsz8\" (UniqueName: \"kubernetes.io/projected/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-kube-api-access-hqsz8\") pod \"8b00fc0a-2c62-480e-93e6-95ed8b1305c8\" (UID: \"8b00fc0a-2c62-480e-93e6-95ed8b1305c8\") "
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.704674 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2wph\" (UniqueName: \"kubernetes.io/projected/8d9cb262-2d02-4933-a57b-05a013784977-kube-api-access-p2wph\") pod \"watcher-db-create-rm9qv\" (UID: \"8d9cb262-2d02-4933-a57b-05a013784977\") " pod="watcher-kuttl-default/watcher-db-create-rm9qv"
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.705110 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-logs" (OuterVolumeSpecName: "logs") pod "8b00fc0a-2c62-480e-93e6-95ed8b1305c8" (UID: "8b00fc0a-2c62-480e-93e6-95ed8b1305c8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.712182 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-kube-api-access-hqsz8" (OuterVolumeSpecName: "kube-api-access-hqsz8") pod "8b00fc0a-2c62-480e-93e6-95ed8b1305c8" (UID: "8b00fc0a-2c62-480e-93e6-95ed8b1305c8"). InnerVolumeSpecName "kube-api-access-hqsz8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.729777 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2wph\" (UniqueName: \"kubernetes.io/projected/8d9cb262-2d02-4933-a57b-05a013784977-kube-api-access-p2wph\") pod \"watcher-db-create-rm9qv\" (UID: \"8d9cb262-2d02-4933-a57b-05a013784977\") " pod="watcher-kuttl-default/watcher-db-create-rm9qv"
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.737471 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b00fc0a-2c62-480e-93e6-95ed8b1305c8" (UID: "8b00fc0a-2c62-480e-93e6-95ed8b1305c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.767512 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-config-data" (OuterVolumeSpecName: "config-data") pod "8b00fc0a-2c62-480e-93e6-95ed8b1305c8" (UID: "8b00fc0a-2c62-480e-93e6-95ed8b1305c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.806338 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.806377 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.806391 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-logs\") on node \"crc\" DevicePath \"\""
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.806403 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqsz8\" (UniqueName: \"kubernetes.io/projected/8b00fc0a-2c62-480e-93e6-95ed8b1305c8-kube-api-access-hqsz8\") on node \"crc\" DevicePath \"\""
Sep 29 11:15:15 crc kubenswrapper[4752]: I0929 11:15:15.872971 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-rm9qv"
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.052475 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="290917fa-62e5-4763-8d82-a941c4e6d6a6" path="/var/lib/kubelet/pods/290917fa-62e5-4763-8d82-a941c4e6d6a6/volumes"
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.053376 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c26afed-cf6c-4876-a817-216eae20e80f" path="/var/lib/kubelet/pods/2c26afed-cf6c-4876-a817-216eae20e80f/volumes"
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.053950 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88232ec0-75f8-407a-90eb-2a8d6fbb7703" path="/var/lib/kubelet/pods/88232ec0-75f8-407a-90eb-2a8d6fbb7703/volumes"
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.158590 4752 generic.go:334] "Generic (PLEG): container finished" podID="f39fa61e-8858-4d6e-80db-bc94fdccaec8" containerID="a17245ec1c9efb32f26a19fe377a440a7fd3b84c08fbbdd6d68079700e9a78e0" exitCode=0
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.158667 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f39fa61e-8858-4d6e-80db-bc94fdccaec8","Type":"ContainerDied","Data":"a17245ec1c9efb32f26a19fe377a440a7fd3b84c08fbbdd6d68079700e9a78e0"}
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.162414 4752 generic.go:334] "Generic (PLEG): container finished" podID="8b09d00c-78f2-4399-bc78-3c901f3470ad" containerID="a59217981fcb9b639e460e9b3c77de8fb9adcca23087040a03f8f57cd23a1b8b" exitCode=0
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.162475 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8b09d00c-78f2-4399-bc78-3c901f3470ad","Type":"ContainerDied","Data":"a59217981fcb9b639e460e9b3c77de8fb9adcca23087040a03f8f57cd23a1b8b"}
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.164882 4752 generic.go:334] "Generic (PLEG): container finished" podID="8b00fc0a-2c62-480e-93e6-95ed8b1305c8" containerID="08fb7c9ca8887afc3730321ac0ce1b0e0e37d821b9e72f8a10de80286cba82ea" exitCode=0
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.164906 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"8b00fc0a-2c62-480e-93e6-95ed8b1305c8","Type":"ContainerDied","Data":"08fb7c9ca8887afc3730321ac0ce1b0e0e37d821b9e72f8a10de80286cba82ea"}
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.164921 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"8b00fc0a-2c62-480e-93e6-95ed8b1305c8","Type":"ContainerDied","Data":"75107b0321cd3aaf5d2910da3a15988d281de943027b68eaf2e91171787ddd8c"}
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.164937 4752 scope.go:117] "RemoveContainer" containerID="08fb7c9ca8887afc3730321ac0ce1b0e0e37d821b9e72f8a10de80286cba82ea"
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.165051 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.201248 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.206447 4752 scope.go:117] "RemoveContainer" containerID="08fb7c9ca8887afc3730321ac0ce1b0e0e37d821b9e72f8a10de80286cba82ea"
Sep 29 11:15:16 crc kubenswrapper[4752]: E0929 11:15:16.206885 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08fb7c9ca8887afc3730321ac0ce1b0e0e37d821b9e72f8a10de80286cba82ea\": container with ID starting with 08fb7c9ca8887afc3730321ac0ce1b0e0e37d821b9e72f8a10de80286cba82ea not found: ID does not exist" containerID="08fb7c9ca8887afc3730321ac0ce1b0e0e37d821b9e72f8a10de80286cba82ea"
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.206917 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08fb7c9ca8887afc3730321ac0ce1b0e0e37d821b9e72f8a10de80286cba82ea"} err="failed to get container status \"08fb7c9ca8887afc3730321ac0ce1b0e0e37d821b9e72f8a10de80286cba82ea\": rpc error: code = NotFound desc = could not find container \"08fb7c9ca8887afc3730321ac0ce1b0e0e37d821b9e72f8a10de80286cba82ea\": container with ID starting with 08fb7c9ca8887afc3730321ac0ce1b0e0e37d821b9e72f8a10de80286cba82ea not found: ID does not exist"
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.210985 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.298911 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.364191 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-rm9qv"]
Sep 29 11:15:16 crc kubenswrapper[4752]: W0929 11:15:16.374146 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d9cb262_2d02_4933_a57b_05a013784977.slice/crio-eab587f35bae7fb196e9e758337e328e4850dde9792f9f2c3ee74aaf1afc456d WatchSource:0}: Error finding container eab587f35bae7fb196e9e758337e328e4850dde9792f9f2c3ee74aaf1afc456d: Status 404 returned error can't find the container with id eab587f35bae7fb196e9e758337e328e4850dde9792f9f2c3ee74aaf1afc456d
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.414873 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-combined-ca-bundle\") pod \"8b09d00c-78f2-4399-bc78-3c901f3470ad\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") "
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.414930 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgjjr\" (UniqueName: \"kubernetes.io/projected/8b09d00c-78f2-4399-bc78-3c901f3470ad-kube-api-access-cgjjr\") pod \"8b09d00c-78f2-4399-bc78-3c901f3470ad\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") "
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.415027 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-sg-core-conf-yaml\") pod \"8b09d00c-78f2-4399-bc78-3c901f3470ad\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") "
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.415052 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-ceilometer-tls-certs\") pod \"8b09d00c-78f2-4399-bc78-3c901f3470ad\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") "
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.415083 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-config-data\") pod \"8b09d00c-78f2-4399-bc78-3c901f3470ad\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") "
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.415120 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b09d00c-78f2-4399-bc78-3c901f3470ad-run-httpd\") pod \"8b09d00c-78f2-4399-bc78-3c901f3470ad\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") "
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.415149 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-scripts\") pod \"8b09d00c-78f2-4399-bc78-3c901f3470ad\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") "
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.415184 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b09d00c-78f2-4399-bc78-3c901f3470ad-log-httpd\") pod \"8b09d00c-78f2-4399-bc78-3c901f3470ad\" (UID: \"8b09d00c-78f2-4399-bc78-3c901f3470ad\") "
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.416285 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b09d00c-78f2-4399-bc78-3c901f3470ad-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8b09d00c-78f2-4399-bc78-3c901f3470ad" (UID: "8b09d00c-78f2-4399-bc78-3c901f3470ad"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.416920 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b09d00c-78f2-4399-bc78-3c901f3470ad-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8b09d00c-78f2-4399-bc78-3c901f3470ad" (UID: "8b09d00c-78f2-4399-bc78-3c901f3470ad"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.423290 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b09d00c-78f2-4399-bc78-3c901f3470ad-kube-api-access-cgjjr" (OuterVolumeSpecName: "kube-api-access-cgjjr") pod "8b09d00c-78f2-4399-bc78-3c901f3470ad" (UID: "8b09d00c-78f2-4399-bc78-3c901f3470ad"). InnerVolumeSpecName "kube-api-access-cgjjr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.424523 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-scripts" (OuterVolumeSpecName: "scripts") pod "8b09d00c-78f2-4399-bc78-3c901f3470ad" (UID: "8b09d00c-78f2-4399-bc78-3c901f3470ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.449194 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8b09d00c-78f2-4399-bc78-3c901f3470ad" (UID: "8b09d00c-78f2-4399-bc78-3c901f3470ad"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.460059 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8b09d00c-78f2-4399-bc78-3c901f3470ad" (UID: "8b09d00c-78f2-4399-bc78-3c901f3470ad"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.478688 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b09d00c-78f2-4399-bc78-3c901f3470ad" (UID: "8b09d00c-78f2-4399-bc78-3c901f3470ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.502479 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.502793 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-config-data" (OuterVolumeSpecName: "config-data") pod "8b09d00c-78f2-4399-bc78-3c901f3470ad" (UID: "8b09d00c-78f2-4399-bc78-3c901f3470ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.516821 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.516847 4752 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.516857 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.516865 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b09d00c-78f2-4399-bc78-3c901f3470ad-run-httpd\") on node \"crc\" DevicePath \"\""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.516873 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-scripts\") on node \"crc\" DevicePath \"\""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.516884 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b09d00c-78f2-4399-bc78-3c901f3470ad-log-httpd\") on node \"crc\" DevicePath \"\""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.516892 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b09d00c-78f2-4399-bc78-3c901f3470ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.516901 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgjjr\" (UniqueName: \"kubernetes.io/projected/8b09d00c-78f2-4399-bc78-3c901f3470ad-kube-api-access-cgjjr\") on node \"crc\" DevicePath \"\""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.618422 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f39fa61e-8858-4d6e-80db-bc94fdccaec8-custom-prometheus-ca\") pod \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\" (UID: \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\") "
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.618491 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f39fa61e-8858-4d6e-80db-bc94fdccaec8-logs\") pod \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\" (UID: \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\") "
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.618594 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f39fa61e-8858-4d6e-80db-bc94fdccaec8-config-data\") pod \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\" (UID: \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\") "
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.618617 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66rbr\" (UniqueName: \"kubernetes.io/projected/f39fa61e-8858-4d6e-80db-bc94fdccaec8-kube-api-access-66rbr\") pod \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\" (UID: \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\") "
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.618671 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f39fa61e-8858-4d6e-80db-bc94fdccaec8-combined-ca-bundle\") pod \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\" (UID: \"f39fa61e-8858-4d6e-80db-bc94fdccaec8\") "
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.618897 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f39fa61e-8858-4d6e-80db-bc94fdccaec8-logs" (OuterVolumeSpecName: "logs") pod "f39fa61e-8858-4d6e-80db-bc94fdccaec8" (UID: "f39fa61e-8858-4d6e-80db-bc94fdccaec8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.618974 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f39fa61e-8858-4d6e-80db-bc94fdccaec8-logs\") on node \"crc\" DevicePath \"\""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.622537 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f39fa61e-8858-4d6e-80db-bc94fdccaec8-kube-api-access-66rbr" (OuterVolumeSpecName: "kube-api-access-66rbr") pod "f39fa61e-8858-4d6e-80db-bc94fdccaec8" (UID: "f39fa61e-8858-4d6e-80db-bc94fdccaec8"). InnerVolumeSpecName "kube-api-access-66rbr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.647160 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f39fa61e-8858-4d6e-80db-bc94fdccaec8-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "f39fa61e-8858-4d6e-80db-bc94fdccaec8" (UID: "f39fa61e-8858-4d6e-80db-bc94fdccaec8"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.653549 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f39fa61e-8858-4d6e-80db-bc94fdccaec8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f39fa61e-8858-4d6e-80db-bc94fdccaec8" (UID: "f39fa61e-8858-4d6e-80db-bc94fdccaec8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.663177 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f39fa61e-8858-4d6e-80db-bc94fdccaec8-config-data" (OuterVolumeSpecName: "config-data") pod "f39fa61e-8858-4d6e-80db-bc94fdccaec8" (UID: "f39fa61e-8858-4d6e-80db-bc94fdccaec8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.720025 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f39fa61e-8858-4d6e-80db-bc94fdccaec8-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.720063 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66rbr\" (UniqueName: \"kubernetes.io/projected/f39fa61e-8858-4d6e-80db-bc94fdccaec8-kube-api-access-66rbr\") on node \"crc\" DevicePath \"\""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.720079 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f39fa61e-8858-4d6e-80db-bc94fdccaec8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 29 11:15:16 crc kubenswrapper[4752]: I0929 11:15:16.720091 4752 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f39fa61e-8858-4d6e-80db-bc94fdccaec8-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.175925 4752 generic.go:334] "Generic (PLEG): container finished" podID="8d9cb262-2d02-4933-a57b-05a013784977" containerID="413a58f9ffaecd4c0151757e902267b0090dd6db37fc31ff45e03744b2d4c1ed" exitCode=0
Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.175995 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-rm9qv" event={"ID":"8d9cb262-2d02-4933-a57b-05a013784977","Type":"ContainerDied","Data":"413a58f9ffaecd4c0151757e902267b0090dd6db37fc31ff45e03744b2d4c1ed"}
Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.176050 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-rm9qv" event={"ID":"8d9cb262-2d02-4933-a57b-05a013784977","Type":"ContainerStarted","Data":"eab587f35bae7fb196e9e758337e328e4850dde9792f9f2c3ee74aaf1afc456d"}
Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.177703 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f39fa61e-8858-4d6e-80db-bc94fdccaec8","Type":"ContainerDied","Data":"936ccb41c8ff30271a8e57f830fc35a21510627a65779dcabce327b0d122c653"}
Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.177721 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.177741 4752 scope.go:117] "RemoveContainer" containerID="a17245ec1c9efb32f26a19fe377a440a7fd3b84c08fbbdd6d68079700e9a78e0"
Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.181689 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8b09d00c-78f2-4399-bc78-3c901f3470ad","Type":"ContainerDied","Data":"0dcf8829037bd340f4cd12ccabd43f42549de12b5bbbe1b0821e3f6f25e724db"}
Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.181791 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.253723 4752 scope.go:117] "RemoveContainer" containerID="f185d6ba4609c787ea95f5509f05e77b699306647b1a4f71d4c17539ca5826c6"
Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.258471 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.267412 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.278388 4752 scope.go:117] "RemoveContainer" containerID="3d0632330e2d8ac690c35022aa66b02ac23ae110fccd155839efcc35e041325d"
Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.284389 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.289672 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.303819 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Sep 29 11:15:17 crc kubenswrapper[4752]: E0929 11:15:17.304117 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b09d00c-78f2-4399-bc78-3c901f3470ad" containerName="ceilometer-notification-agent"
Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.304136 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b09d00c-78f2-4399-bc78-3c901f3470ad" containerName="ceilometer-notification-agent"
Sep 29 11:15:17 crc kubenswrapper[4752]: E0929 11:15:17.304145 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b00fc0a-2c62-480e-93e6-95ed8b1305c8" containerName="watcher-applier"
Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.304151 4752 state_mem.go:107] "Deleted CPUSet
assignment" podUID="8b00fc0a-2c62-480e-93e6-95ed8b1305c8" containerName="watcher-applier" Sep 29 11:15:17 crc kubenswrapper[4752]: E0929 11:15:17.304162 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b09d00c-78f2-4399-bc78-3c901f3470ad" containerName="ceilometer-central-agent" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.304168 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b09d00c-78f2-4399-bc78-3c901f3470ad" containerName="ceilometer-central-agent" Sep 29 11:15:17 crc kubenswrapper[4752]: E0929 11:15:17.304179 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b09d00c-78f2-4399-bc78-3c901f3470ad" containerName="sg-core" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.304185 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b09d00c-78f2-4399-bc78-3c901f3470ad" containerName="sg-core" Sep 29 11:15:17 crc kubenswrapper[4752]: E0929 11:15:17.304199 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b09d00c-78f2-4399-bc78-3c901f3470ad" containerName="proxy-httpd" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.304205 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b09d00c-78f2-4399-bc78-3c901f3470ad" containerName="proxy-httpd" Sep 29 11:15:17 crc kubenswrapper[4752]: E0929 11:15:17.304224 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f39fa61e-8858-4d6e-80db-bc94fdccaec8" containerName="watcher-decision-engine" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.304230 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f39fa61e-8858-4d6e-80db-bc94fdccaec8" containerName="watcher-decision-engine" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.304361 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b09d00c-78f2-4399-bc78-3c901f3470ad" containerName="ceilometer-notification-agent" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.304372 4752 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8b09d00c-78f2-4399-bc78-3c901f3470ad" containerName="sg-core" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.304385 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f39fa61e-8858-4d6e-80db-bc94fdccaec8" containerName="watcher-decision-engine" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.304398 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b09d00c-78f2-4399-bc78-3c901f3470ad" containerName="proxy-httpd" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.304410 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b09d00c-78f2-4399-bc78-3c901f3470ad" containerName="ceilometer-central-agent" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.304417 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b00fc0a-2c62-480e-93e6-95ed8b1305c8" containerName="watcher-applier" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.306307 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.307292 4752 scope.go:117] "RemoveContainer" containerID="a59217981fcb9b639e460e9b3c77de8fb9adcca23087040a03f8f57cd23a1b8b" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.309258 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.309587 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.310791 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.328012 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.329340 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-scripts\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.329396 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.329435 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83dd436e-d5bb-40a5-93f7-d6587789f13f-log-httpd\") pod \"ceilometer-0\" 
(UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.329467 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-config-data\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.329531 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.329571 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.329598 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83dd436e-d5bb-40a5-93f7-d6587789f13f-run-httpd\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.329640 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4j5f\" (UniqueName: \"kubernetes.io/projected/83dd436e-d5bb-40a5-93f7-d6587789f13f-kube-api-access-g4j5f\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " 
pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.349006 4752 scope.go:117] "RemoveContainer" containerID="7027072bd227485b6c8846e90b7a47acbf71310ea093d71a484c0d57bd584f85" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.430967 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83dd436e-d5bb-40a5-93f7-d6587789f13f-run-httpd\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.431040 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4j5f\" (UniqueName: \"kubernetes.io/projected/83dd436e-d5bb-40a5-93f7-d6587789f13f-kube-api-access-g4j5f\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.431109 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-scripts\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.431133 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.431155 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83dd436e-d5bb-40a5-93f7-d6587789f13f-log-httpd\") pod \"ceilometer-0\" (UID: 
\"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.431181 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-config-data\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.431235 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.431271 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.431433 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83dd436e-d5bb-40a5-93f7-d6587789f13f-run-httpd\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.431664 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83dd436e-d5bb-40a5-93f7-d6587789f13f-log-httpd\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.435212 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-scripts\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.436101 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.436355 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.436598 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-config-data\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.436998 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.452422 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4j5f\" (UniqueName: \"kubernetes.io/projected/83dd436e-d5bb-40a5-93f7-d6587789f13f-kube-api-access-g4j5f\") pod \"ceilometer-0\" (UID: 
\"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:17 crc kubenswrapper[4752]: I0929 11:15:17.625120 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:18 crc kubenswrapper[4752]: I0929 11:15:18.041273 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b00fc0a-2c62-480e-93e6-95ed8b1305c8" path="/var/lib/kubelet/pods/8b00fc0a-2c62-480e-93e6-95ed8b1305c8/volumes" Sep 29 11:15:18 crc kubenswrapper[4752]: I0929 11:15:18.042275 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b09d00c-78f2-4399-bc78-3c901f3470ad" path="/var/lib/kubelet/pods/8b09d00c-78f2-4399-bc78-3c901f3470ad/volumes" Sep 29 11:15:18 crc kubenswrapper[4752]: I0929 11:15:18.043171 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f39fa61e-8858-4d6e-80db-bc94fdccaec8" path="/var/lib/kubelet/pods/f39fa61e-8858-4d6e-80db-bc94fdccaec8/volumes" Sep 29 11:15:18 crc kubenswrapper[4752]: I0929 11:15:18.088438 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:15:18 crc kubenswrapper[4752]: W0929 11:15:18.093347 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83dd436e_d5bb_40a5_93f7_d6587789f13f.slice/crio-a5869d471f0631604f24a318bd87936c598ab96383d7c2a06273a712c742059d WatchSource:0}: Error finding container a5869d471f0631604f24a318bd87936c598ab96383d7c2a06273a712c742059d: Status 404 returned error can't find the container with id a5869d471f0631604f24a318bd87936c598ab96383d7c2a06273a712c742059d Sep 29 11:15:18 crc kubenswrapper[4752]: I0929 11:15:18.196903 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"83dd436e-d5bb-40a5-93f7-d6587789f13f","Type":"ContainerStarted","Data":"a5869d471f0631604f24a318bd87936c598ab96383d7c2a06273a712c742059d"} Sep 29 11:15:18 crc kubenswrapper[4752]: I0929 11:15:18.657988 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-rm9qv" Sep 29 11:15:18 crc kubenswrapper[4752]: I0929 11:15:18.751436 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2wph\" (UniqueName: \"kubernetes.io/projected/8d9cb262-2d02-4933-a57b-05a013784977-kube-api-access-p2wph\") pod \"8d9cb262-2d02-4933-a57b-05a013784977\" (UID: \"8d9cb262-2d02-4933-a57b-05a013784977\") " Sep 29 11:15:18 crc kubenswrapper[4752]: I0929 11:15:18.756329 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d9cb262-2d02-4933-a57b-05a013784977-kube-api-access-p2wph" (OuterVolumeSpecName: "kube-api-access-p2wph") pod "8d9cb262-2d02-4933-a57b-05a013784977" (UID: "8d9cb262-2d02-4933-a57b-05a013784977"). InnerVolumeSpecName "kube-api-access-p2wph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:15:18 crc kubenswrapper[4752]: I0929 11:15:18.853682 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2wph\" (UniqueName: \"kubernetes.io/projected/8d9cb262-2d02-4933-a57b-05a013784977-kube-api-access-p2wph\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:19 crc kubenswrapper[4752]: I0929 11:15:19.215164 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"83dd436e-d5bb-40a5-93f7-d6587789f13f","Type":"ContainerStarted","Data":"628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074"} Sep 29 11:15:19 crc kubenswrapper[4752]: I0929 11:15:19.220368 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-rm9qv" event={"ID":"8d9cb262-2d02-4933-a57b-05a013784977","Type":"ContainerDied","Data":"eab587f35bae7fb196e9e758337e328e4850dde9792f9f2c3ee74aaf1afc456d"} Sep 29 11:15:19 crc kubenswrapper[4752]: I0929 11:15:19.220416 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eab587f35bae7fb196e9e758337e328e4850dde9792f9f2c3ee74aaf1afc456d" Sep 29 11:15:19 crc kubenswrapper[4752]: I0929 11:15:19.220484 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-rm9qv" Sep 29 11:15:20 crc kubenswrapper[4752]: I0929 11:15:20.231423 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"83dd436e-d5bb-40a5-93f7-d6587789f13f","Type":"ContainerStarted","Data":"d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4"} Sep 29 11:15:21 crc kubenswrapper[4752]: I0929 11:15:21.241198 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"83dd436e-d5bb-40a5-93f7-d6587789f13f","Type":"ContainerStarted","Data":"6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c"} Sep 29 11:15:21 crc kubenswrapper[4752]: I0929 11:15:21.256366 4752 scope.go:117] "RemoveContainer" containerID="b6f199eebbc77d83b64cdc8243513a246ec991f0c48284c2c2e95bfaeea3d843" Sep 29 11:15:23 crc kubenswrapper[4752]: I0929 11:15:23.268349 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"83dd436e-d5bb-40a5-93f7-d6587789f13f","Type":"ContainerStarted","Data":"c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d"} Sep 29 11:15:23 crc kubenswrapper[4752]: I0929 11:15:23.268882 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:23 crc kubenswrapper[4752]: I0929 11:15:23.300628 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.981144748 podStartE2EDuration="6.300610894s" podCreationTimestamp="2025-09-29 11:15:17 +0000 UTC" firstStartedPulling="2025-09-29 11:15:18.095880398 +0000 UTC m=+1858.885022065" lastFinishedPulling="2025-09-29 11:15:22.415346544 +0000 UTC m=+1863.204488211" observedRunningTime="2025-09-29 11:15:23.299619628 +0000 UTC m=+1864.088761295" watchObservedRunningTime="2025-09-29 11:15:23.300610894 +0000 UTC m=+1864.089752561" Sep 29 11:15:25 
crc kubenswrapper[4752]: I0929 11:15:25.567029 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-d1a5-account-create-p8qf5"] Sep 29 11:15:25 crc kubenswrapper[4752]: E0929 11:15:25.567582 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9cb262-2d02-4933-a57b-05a013784977" containerName="mariadb-database-create" Sep 29 11:15:25 crc kubenswrapper[4752]: I0929 11:15:25.567593 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9cb262-2d02-4933-a57b-05a013784977" containerName="mariadb-database-create" Sep 29 11:15:25 crc kubenswrapper[4752]: I0929 11:15:25.567753 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d9cb262-2d02-4933-a57b-05a013784977" containerName="mariadb-database-create" Sep 29 11:15:25 crc kubenswrapper[4752]: I0929 11:15:25.568283 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-d1a5-account-create-p8qf5" Sep 29 11:15:25 crc kubenswrapper[4752]: I0929 11:15:25.571298 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Sep 29 11:15:25 crc kubenswrapper[4752]: I0929 11:15:25.588233 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-d1a5-account-create-p8qf5"] Sep 29 11:15:25 crc kubenswrapper[4752]: I0929 11:15:25.765429 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrc8r\" (UniqueName: \"kubernetes.io/projected/d41c2494-66d2-4067-9019-409ba7cdab85-kube-api-access-vrc8r\") pod \"watcher-d1a5-account-create-p8qf5\" (UID: \"d41c2494-66d2-4067-9019-409ba7cdab85\") " pod="watcher-kuttl-default/watcher-d1a5-account-create-p8qf5" Sep 29 11:15:25 crc kubenswrapper[4752]: I0929 11:15:25.866582 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrc8r\" (UniqueName: 
\"kubernetes.io/projected/d41c2494-66d2-4067-9019-409ba7cdab85-kube-api-access-vrc8r\") pod \"watcher-d1a5-account-create-p8qf5\" (UID: \"d41c2494-66d2-4067-9019-409ba7cdab85\") " pod="watcher-kuttl-default/watcher-d1a5-account-create-p8qf5" Sep 29 11:15:25 crc kubenswrapper[4752]: I0929 11:15:25.888422 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrc8r\" (UniqueName: \"kubernetes.io/projected/d41c2494-66d2-4067-9019-409ba7cdab85-kube-api-access-vrc8r\") pod \"watcher-d1a5-account-create-p8qf5\" (UID: \"d41c2494-66d2-4067-9019-409ba7cdab85\") " pod="watcher-kuttl-default/watcher-d1a5-account-create-p8qf5" Sep 29 11:15:25 crc kubenswrapper[4752]: I0929 11:15:25.888852 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-d1a5-account-create-p8qf5" Sep 29 11:15:26 crc kubenswrapper[4752]: I0929 11:15:26.326458 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-d1a5-account-create-p8qf5"] Sep 29 11:15:27 crc kubenswrapper[4752]: I0929 11:15:27.298474 4752 generic.go:334] "Generic (PLEG): container finished" podID="d41c2494-66d2-4067-9019-409ba7cdab85" containerID="d9ce4c48f5a8205cef2943a7896798e3861683be3a68bec961bcc5a03155448a" exitCode=0 Sep 29 11:15:27 crc kubenswrapper[4752]: I0929 11:15:27.298773 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-d1a5-account-create-p8qf5" event={"ID":"d41c2494-66d2-4067-9019-409ba7cdab85","Type":"ContainerDied","Data":"d9ce4c48f5a8205cef2943a7896798e3861683be3a68bec961bcc5a03155448a"} Sep 29 11:15:27 crc kubenswrapper[4752]: I0929 11:15:27.298823 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-d1a5-account-create-p8qf5" event={"ID":"d41c2494-66d2-4067-9019-409ba7cdab85","Type":"ContainerStarted","Data":"fcac1d2955aba8ffc67ed0510fec419a485ca07f23dd8b64fd76b0c5fd66157f"} Sep 29 11:15:28 crc kubenswrapper[4752]: 
I0929 11:15:28.622104 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-d1a5-account-create-p8qf5" Sep 29 11:15:28 crc kubenswrapper[4752]: I0929 11:15:28.815001 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrc8r\" (UniqueName: \"kubernetes.io/projected/d41c2494-66d2-4067-9019-409ba7cdab85-kube-api-access-vrc8r\") pod \"d41c2494-66d2-4067-9019-409ba7cdab85\" (UID: \"d41c2494-66d2-4067-9019-409ba7cdab85\") " Sep 29 11:15:28 crc kubenswrapper[4752]: I0929 11:15:28.822498 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d41c2494-66d2-4067-9019-409ba7cdab85-kube-api-access-vrc8r" (OuterVolumeSpecName: "kube-api-access-vrc8r") pod "d41c2494-66d2-4067-9019-409ba7cdab85" (UID: "d41c2494-66d2-4067-9019-409ba7cdab85"). InnerVolumeSpecName "kube-api-access-vrc8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:15:28 crc kubenswrapper[4752]: I0929 11:15:28.917055 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrc8r\" (UniqueName: \"kubernetes.io/projected/d41c2494-66d2-4067-9019-409ba7cdab85-kube-api-access-vrc8r\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:29 crc kubenswrapper[4752]: I0929 11:15:29.316048 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-d1a5-account-create-p8qf5" event={"ID":"d41c2494-66d2-4067-9019-409ba7cdab85","Type":"ContainerDied","Data":"fcac1d2955aba8ffc67ed0510fec419a485ca07f23dd8b64fd76b0c5fd66157f"} Sep 29 11:15:29 crc kubenswrapper[4752]: I0929 11:15:29.316091 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcac1d2955aba8ffc67ed0510fec419a485ca07f23dd8b64fd76b0c5fd66157f" Sep 29 11:15:29 crc kubenswrapper[4752]: I0929 11:15:29.316115 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-d1a5-account-create-p8qf5" Sep 29 11:15:30 crc kubenswrapper[4752]: I0929 11:15:30.837155 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-r67pr"] Sep 29 11:15:30 crc kubenswrapper[4752]: E0929 11:15:30.837864 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d41c2494-66d2-4067-9019-409ba7cdab85" containerName="mariadb-account-create" Sep 29 11:15:30 crc kubenswrapper[4752]: I0929 11:15:30.837877 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d41c2494-66d2-4067-9019-409ba7cdab85" containerName="mariadb-account-create" Sep 29 11:15:30 crc kubenswrapper[4752]: I0929 11:15:30.838038 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d41c2494-66d2-4067-9019-409ba7cdab85" containerName="mariadb-account-create" Sep 29 11:15:30 crc kubenswrapper[4752]: I0929 11:15:30.838656 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" Sep 29 11:15:30 crc kubenswrapper[4752]: I0929 11:15:30.841047 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-hjz6l" Sep 29 11:15:30 crc kubenswrapper[4752]: I0929 11:15:30.845732 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Sep 29 11:15:30 crc kubenswrapper[4752]: I0929 11:15:30.849103 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-r67pr"] Sep 29 11:15:30 crc kubenswrapper[4752]: I0929 11:15:30.850132 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/30380408-18cc-4feb-b122-aa9ae9047279-db-sync-config-data\") pod \"watcher-kuttl-db-sync-r67pr\" (UID: \"30380408-18cc-4feb-b122-aa9ae9047279\") " 
pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" Sep 29 11:15:30 crc kubenswrapper[4752]: I0929 11:15:30.850200 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgsp9\" (UniqueName: \"kubernetes.io/projected/30380408-18cc-4feb-b122-aa9ae9047279-kube-api-access-dgsp9\") pod \"watcher-kuttl-db-sync-r67pr\" (UID: \"30380408-18cc-4feb-b122-aa9ae9047279\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" Sep 29 11:15:30 crc kubenswrapper[4752]: I0929 11:15:30.850348 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30380408-18cc-4feb-b122-aa9ae9047279-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-r67pr\" (UID: \"30380408-18cc-4feb-b122-aa9ae9047279\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" Sep 29 11:15:30 crc kubenswrapper[4752]: I0929 11:15:30.850389 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30380408-18cc-4feb-b122-aa9ae9047279-config-data\") pod \"watcher-kuttl-db-sync-r67pr\" (UID: \"30380408-18cc-4feb-b122-aa9ae9047279\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" Sep 29 11:15:30 crc kubenswrapper[4752]: I0929 11:15:30.951380 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30380408-18cc-4feb-b122-aa9ae9047279-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-r67pr\" (UID: \"30380408-18cc-4feb-b122-aa9ae9047279\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" Sep 29 11:15:30 crc kubenswrapper[4752]: I0929 11:15:30.951691 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30380408-18cc-4feb-b122-aa9ae9047279-config-data\") pod 
\"watcher-kuttl-db-sync-r67pr\" (UID: \"30380408-18cc-4feb-b122-aa9ae9047279\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" Sep 29 11:15:30 crc kubenswrapper[4752]: I0929 11:15:30.951729 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/30380408-18cc-4feb-b122-aa9ae9047279-db-sync-config-data\") pod \"watcher-kuttl-db-sync-r67pr\" (UID: \"30380408-18cc-4feb-b122-aa9ae9047279\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" Sep 29 11:15:30 crc kubenswrapper[4752]: I0929 11:15:30.951763 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgsp9\" (UniqueName: \"kubernetes.io/projected/30380408-18cc-4feb-b122-aa9ae9047279-kube-api-access-dgsp9\") pod \"watcher-kuttl-db-sync-r67pr\" (UID: \"30380408-18cc-4feb-b122-aa9ae9047279\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" Sep 29 11:15:30 crc kubenswrapper[4752]: I0929 11:15:30.969639 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/30380408-18cc-4feb-b122-aa9ae9047279-db-sync-config-data\") pod \"watcher-kuttl-db-sync-r67pr\" (UID: \"30380408-18cc-4feb-b122-aa9ae9047279\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" Sep 29 11:15:30 crc kubenswrapper[4752]: I0929 11:15:30.969863 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30380408-18cc-4feb-b122-aa9ae9047279-config-data\") pod \"watcher-kuttl-db-sync-r67pr\" (UID: \"30380408-18cc-4feb-b122-aa9ae9047279\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" Sep 29 11:15:30 crc kubenswrapper[4752]: I0929 11:15:30.970643 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30380408-18cc-4feb-b122-aa9ae9047279-combined-ca-bundle\") pod 
\"watcher-kuttl-db-sync-r67pr\" (UID: \"30380408-18cc-4feb-b122-aa9ae9047279\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" Sep 29 11:15:30 crc kubenswrapper[4752]: I0929 11:15:30.972084 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgsp9\" (UniqueName: \"kubernetes.io/projected/30380408-18cc-4feb-b122-aa9ae9047279-kube-api-access-dgsp9\") pod \"watcher-kuttl-db-sync-r67pr\" (UID: \"30380408-18cc-4feb-b122-aa9ae9047279\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" Sep 29 11:15:31 crc kubenswrapper[4752]: I0929 11:15:31.165551 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" Sep 29 11:15:31 crc kubenswrapper[4752]: I0929 11:15:31.639702 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-r67pr"] Sep 29 11:15:32 crc kubenswrapper[4752]: I0929 11:15:32.339847 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" event={"ID":"30380408-18cc-4feb-b122-aa9ae9047279","Type":"ContainerStarted","Data":"bb590b8744c608a6ef5cdfc1dc8db5a55a975093e91c7c95e5fec12d0bc116b7"} Sep 29 11:15:32 crc kubenswrapper[4752]: I0929 11:15:32.340222 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" event={"ID":"30380408-18cc-4feb-b122-aa9ae9047279","Type":"ContainerStarted","Data":"8b6487b2a80978d6fe633c54c144b3370661f6dd23460ef79a9c70297d9dd113"} Sep 29 11:15:32 crc kubenswrapper[4752]: I0929 11:15:32.358382 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" podStartSLOduration=2.358363333 podStartE2EDuration="2.358363333s" podCreationTimestamp="2025-09-29 11:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-29 11:15:32.354997284 +0000 UTC m=+1873.144138951" watchObservedRunningTime="2025-09-29 11:15:32.358363333 +0000 UTC m=+1873.147505000" Sep 29 11:15:34 crc kubenswrapper[4752]: I0929 11:15:34.371633 4752 generic.go:334] "Generic (PLEG): container finished" podID="30380408-18cc-4feb-b122-aa9ae9047279" containerID="bb590b8744c608a6ef5cdfc1dc8db5a55a975093e91c7c95e5fec12d0bc116b7" exitCode=0 Sep 29 11:15:34 crc kubenswrapper[4752]: I0929 11:15:34.371701 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" event={"ID":"30380408-18cc-4feb-b122-aa9ae9047279","Type":"ContainerDied","Data":"bb590b8744c608a6ef5cdfc1dc8db5a55a975093e91c7c95e5fec12d0bc116b7"} Sep 29 11:15:35 crc kubenswrapper[4752]: I0929 11:15:35.718740 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" Sep 29 11:15:35 crc kubenswrapper[4752]: I0929 11:15:35.826264 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30380408-18cc-4feb-b122-aa9ae9047279-combined-ca-bundle\") pod \"30380408-18cc-4feb-b122-aa9ae9047279\" (UID: \"30380408-18cc-4feb-b122-aa9ae9047279\") " Sep 29 11:15:35 crc kubenswrapper[4752]: I0929 11:15:35.826751 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgsp9\" (UniqueName: \"kubernetes.io/projected/30380408-18cc-4feb-b122-aa9ae9047279-kube-api-access-dgsp9\") pod \"30380408-18cc-4feb-b122-aa9ae9047279\" (UID: \"30380408-18cc-4feb-b122-aa9ae9047279\") " Sep 29 11:15:35 crc kubenswrapper[4752]: I0929 11:15:35.826785 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30380408-18cc-4feb-b122-aa9ae9047279-config-data\") pod \"30380408-18cc-4feb-b122-aa9ae9047279\" (UID: 
\"30380408-18cc-4feb-b122-aa9ae9047279\") " Sep 29 11:15:35 crc kubenswrapper[4752]: I0929 11:15:35.826883 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/30380408-18cc-4feb-b122-aa9ae9047279-db-sync-config-data\") pod \"30380408-18cc-4feb-b122-aa9ae9047279\" (UID: \"30380408-18cc-4feb-b122-aa9ae9047279\") " Sep 29 11:15:35 crc kubenswrapper[4752]: I0929 11:15:35.832847 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30380408-18cc-4feb-b122-aa9ae9047279-kube-api-access-dgsp9" (OuterVolumeSpecName: "kube-api-access-dgsp9") pod "30380408-18cc-4feb-b122-aa9ae9047279" (UID: "30380408-18cc-4feb-b122-aa9ae9047279"). InnerVolumeSpecName "kube-api-access-dgsp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:15:35 crc kubenswrapper[4752]: I0929 11:15:35.844837 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30380408-18cc-4feb-b122-aa9ae9047279-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "30380408-18cc-4feb-b122-aa9ae9047279" (UID: "30380408-18cc-4feb-b122-aa9ae9047279"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:15:35 crc kubenswrapper[4752]: I0929 11:15:35.849332 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30380408-18cc-4feb-b122-aa9ae9047279-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30380408-18cc-4feb-b122-aa9ae9047279" (UID: "30380408-18cc-4feb-b122-aa9ae9047279"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:15:35 crc kubenswrapper[4752]: I0929 11:15:35.866853 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30380408-18cc-4feb-b122-aa9ae9047279-config-data" (OuterVolumeSpecName: "config-data") pod "30380408-18cc-4feb-b122-aa9ae9047279" (UID: "30380408-18cc-4feb-b122-aa9ae9047279"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:15:35 crc kubenswrapper[4752]: I0929 11:15:35.928714 4752 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/30380408-18cc-4feb-b122-aa9ae9047279-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:35 crc kubenswrapper[4752]: I0929 11:15:35.928764 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30380408-18cc-4feb-b122-aa9ae9047279-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:35 crc kubenswrapper[4752]: I0929 11:15:35.928773 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgsp9\" (UniqueName: \"kubernetes.io/projected/30380408-18cc-4feb-b122-aa9ae9047279-kube-api-access-dgsp9\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:35 crc kubenswrapper[4752]: I0929 11:15:35.928783 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30380408-18cc-4feb-b122-aa9ae9047279-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.388558 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" event={"ID":"30380408-18cc-4feb-b122-aa9ae9047279","Type":"ContainerDied","Data":"8b6487b2a80978d6fe633c54c144b3370661f6dd23460ef79a9c70297d9dd113"} Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.388620 4752 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="8b6487b2a80978d6fe633c54c144b3370661f6dd23460ef79a9c70297d9dd113" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.388599 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-r67pr" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.686502 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:15:36 crc kubenswrapper[4752]: E0929 11:15:36.686844 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30380408-18cc-4feb-b122-aa9ae9047279" containerName="watcher-kuttl-db-sync" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.686859 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="30380408-18cc-4feb-b122-aa9ae9047279" containerName="watcher-kuttl-db-sync" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.687023 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="30380408-18cc-4feb-b122-aa9ae9047279" containerName="watcher-kuttl-db-sync" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.687889 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.691744 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-internal-svc" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.691920 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-hjz6l" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.691921 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.695916 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-public-svc" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.696284 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.703376 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.712981 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.725092 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.740378 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.783893 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4632531-2c83-465d-9906-aa26083e17b4-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e4632531-2c83-465d-9906-aa26083e17b4\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.783957 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4632531-2c83-465d-9906-aa26083e17b4-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e4632531-2c83-465d-9906-aa26083e17b4\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.788421 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e4632531-2c83-465d-9906-aa26083e17b4-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e4632531-2c83-465d-9906-aa26083e17b4\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.788709 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnx6k\" (UniqueName: \"kubernetes.io/projected/e4632531-2c83-465d-9906-aa26083e17b4-kube-api-access-tnx6k\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e4632531-2c83-465d-9906-aa26083e17b4\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.788769 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4632531-2c83-465d-9906-aa26083e17b4-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e4632531-2c83-465d-9906-aa26083e17b4\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.811476 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.813038 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.819671 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.825999 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.890261 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.890324 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.890348 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.890419 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnx6k\" (UniqueName: \"kubernetes.io/projected/e4632531-2c83-465d-9906-aa26083e17b4-kube-api-access-tnx6k\") pod \"watcher-kuttl-decision-engine-0\" (UID: 
\"e4632531-2c83-465d-9906-aa26083e17b4\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.890453 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.890476 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4632531-2c83-465d-9906-aa26083e17b4-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e4632531-2c83-465d-9906-aa26083e17b4\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.890553 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.890590 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4632531-2c83-465d-9906-aa26083e17b4-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e4632531-2c83-465d-9906-aa26083e17b4\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.890621 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4632531-2c83-465d-9906-aa26083e17b4-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: 
\"e4632531-2c83-465d-9906-aa26083e17b4\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.890643 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-logs\") pod \"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.890671 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-446sm\" (UniqueName: \"kubernetes.io/projected/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-kube-api-access-446sm\") pod \"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.890718 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e4632531-2c83-465d-9906-aa26083e17b4-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e4632531-2c83-465d-9906-aa26083e17b4\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.892105 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4632531-2c83-465d-9906-aa26083e17b4-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e4632531-2c83-465d-9906-aa26083e17b4\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.895261 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4632531-2c83-465d-9906-aa26083e17b4-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: 
\"e4632531-2c83-465d-9906-aa26083e17b4\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.895327 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4632531-2c83-465d-9906-aa26083e17b4-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e4632531-2c83-465d-9906-aa26083e17b4\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.896288 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e4632531-2c83-465d-9906-aa26083e17b4-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e4632531-2c83-465d-9906-aa26083e17b4\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.914269 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnx6k\" (UniqueName: \"kubernetes.io/projected/e4632531-2c83-465d-9906-aa26083e17b4-kube-api-access-tnx6k\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e4632531-2c83-465d-9906-aa26083e17b4\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.992016 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.992063 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-internal-tls-certs\") pod 
\"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.992080 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.992152 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.992215 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b36513c-f87c-4873-9ab6-629ccbb9c58e-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"1b36513c-f87c-4873-9ab6-629ccbb9c58e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.992236 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b36513c-f87c-4873-9ab6-629ccbb9c58e-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"1b36513c-f87c-4873-9ab6-629ccbb9c58e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.992264 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvsgf\" (UniqueName: \"kubernetes.io/projected/1b36513c-f87c-4873-9ab6-629ccbb9c58e-kube-api-access-qvsgf\") pod 
\"watcher-kuttl-applier-0\" (UID: \"1b36513c-f87c-4873-9ab6-629ccbb9c58e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.992283 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.992313 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-logs\") pod \"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.992336 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-446sm\" (UniqueName: \"kubernetes.io/projected/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-kube-api-access-446sm\") pod \"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.992364 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b36513c-f87c-4873-9ab6-629ccbb9c58e-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"1b36513c-f87c-4873-9ab6-629ccbb9c58e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:15:36 crc kubenswrapper[4752]: I0929 11:15:36.992900 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-logs\") pod \"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:37 crc kubenswrapper[4752]: I0929 11:15:37.001201 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:37 crc kubenswrapper[4752]: I0929 11:15:37.001223 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:37 crc kubenswrapper[4752]: I0929 11:15:37.001619 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:37 crc kubenswrapper[4752]: I0929 11:15:37.009562 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:37 crc kubenswrapper[4752]: I0929 11:15:37.009750 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:37 crc kubenswrapper[4752]: I0929 11:15:37.020370 
4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-446sm\" (UniqueName: \"kubernetes.io/projected/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-kube-api-access-446sm\") pod \"watcher-kuttl-api-0\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:37 crc kubenswrapper[4752]: I0929 11:15:37.041421 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:37 crc kubenswrapper[4752]: I0929 11:15:37.095612 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b36513c-f87c-4873-9ab6-629ccbb9c58e-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"1b36513c-f87c-4873-9ab6-629ccbb9c58e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:15:37 crc kubenswrapper[4752]: I0929 11:15:37.095724 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b36513c-f87c-4873-9ab6-629ccbb9c58e-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"1b36513c-f87c-4873-9ab6-629ccbb9c58e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:15:37 crc kubenswrapper[4752]: I0929 11:15:37.095745 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b36513c-f87c-4873-9ab6-629ccbb9c58e-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"1b36513c-f87c-4873-9ab6-629ccbb9c58e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:15:37 crc kubenswrapper[4752]: I0929 11:15:37.095773 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvsgf\" (UniqueName: \"kubernetes.io/projected/1b36513c-f87c-4873-9ab6-629ccbb9c58e-kube-api-access-qvsgf\") pod \"watcher-kuttl-applier-0\" (UID: 
\"1b36513c-f87c-4873-9ab6-629ccbb9c58e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:15:37 crc kubenswrapper[4752]: I0929 11:15:37.097496 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b36513c-f87c-4873-9ab6-629ccbb9c58e-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"1b36513c-f87c-4873-9ab6-629ccbb9c58e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:15:37 crc kubenswrapper[4752]: I0929 11:15:37.103869 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b36513c-f87c-4873-9ab6-629ccbb9c58e-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"1b36513c-f87c-4873-9ab6-629ccbb9c58e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:15:37 crc kubenswrapper[4752]: I0929 11:15:37.108605 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b36513c-f87c-4873-9ab6-629ccbb9c58e-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"1b36513c-f87c-4873-9ab6-629ccbb9c58e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:15:37 crc kubenswrapper[4752]: I0929 11:15:37.137980 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvsgf\" (UniqueName: \"kubernetes.io/projected/1b36513c-f87c-4873-9ab6-629ccbb9c58e-kube-api-access-qvsgf\") pod \"watcher-kuttl-applier-0\" (UID: \"1b36513c-f87c-4873-9ab6-629ccbb9c58e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:15:37 crc kubenswrapper[4752]: I0929 11:15:37.305710 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:37 crc kubenswrapper[4752]: I0929 11:15:37.434949 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:15:37 crc kubenswrapper[4752]: I0929 11:15:37.599220 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:15:37 crc kubenswrapper[4752]: W0929 11:15:37.601560 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4632531_2c83_465d_9906_aa26083e17b4.slice/crio-b017fd771e4601da235b357aae98aee69a55aa93c450e2b2082996bd07c6293c WatchSource:0}: Error finding container b017fd771e4601da235b357aae98aee69a55aa93c450e2b2082996bd07c6293c: Status 404 returned error can't find the container with id b017fd771e4601da235b357aae98aee69a55aa93c450e2b2082996bd07c6293c Sep 29 11:15:37 crc kubenswrapper[4752]: I0929 11:15:37.763671 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:15:37 crc kubenswrapper[4752]: W0929 11:15:37.765180 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b4e8f55_830d_4a8a_928b_869e27fdc3ea.slice/crio-75a796114782c0013e8f87b5c614a80211226f26f97df7d06848656b325b0b03 WatchSource:0}: Error finding container 75a796114782c0013e8f87b5c614a80211226f26f97df7d06848656b325b0b03: Status 404 returned error can't find the container with id 75a796114782c0013e8f87b5c614a80211226f26f97df7d06848656b325b0b03 Sep 29 11:15:37 crc kubenswrapper[4752]: I0929 11:15:37.930791 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:15:38 crc kubenswrapper[4752]: I0929 11:15:38.413574 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" 
event={"ID":"e4632531-2c83-465d-9906-aa26083e17b4","Type":"ContainerStarted","Data":"d6ec7c838c546cf5bc7200af013a511751d77a94449b3f368637d3a980d8d99a"} Sep 29 11:15:38 crc kubenswrapper[4752]: I0929 11:15:38.413908 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"e4632531-2c83-465d-9906-aa26083e17b4","Type":"ContainerStarted","Data":"b017fd771e4601da235b357aae98aee69a55aa93c450e2b2082996bd07c6293c"} Sep 29 11:15:38 crc kubenswrapper[4752]: I0929 11:15:38.416995 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1b4e8f55-830d-4a8a-928b-869e27fdc3ea","Type":"ContainerStarted","Data":"253b5aedaef3072a17469aceb0d35b41e5d9a7d91657f3ad991b4b180da75a83"} Sep 29 11:15:38 crc kubenswrapper[4752]: I0929 11:15:38.417027 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1b4e8f55-830d-4a8a-928b-869e27fdc3ea","Type":"ContainerStarted","Data":"fcea50b93181f051cdbb5b41c19c2e316e880c6ee9f4e5e21e7567b886c3169c"} Sep 29 11:15:38 crc kubenswrapper[4752]: I0929 11:15:38.417036 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1b4e8f55-830d-4a8a-928b-869e27fdc3ea","Type":"ContainerStarted","Data":"75a796114782c0013e8f87b5c614a80211226f26f97df7d06848656b325b0b03"} Sep 29 11:15:38 crc kubenswrapper[4752]: I0929 11:15:38.419459 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:38 crc kubenswrapper[4752]: I0929 11:15:38.421729 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"1b36513c-f87c-4873-9ab6-629ccbb9c58e","Type":"ContainerStarted","Data":"93e5dc80b66d53ac1cbf12e5f78125492d39dc1295524630be0738540264aca1"} Sep 29 11:15:38 crc kubenswrapper[4752]: I0929 
11:15:38.421760 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"1b36513c-f87c-4873-9ab6-629ccbb9c58e","Type":"ContainerStarted","Data":"6e509b79b09fc3075b491601313f1f6876d9596deda0cf560880649df13eedfc"} Sep 29 11:15:38 crc kubenswrapper[4752]: I0929 11:15:38.441351 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.441329682 podStartE2EDuration="2.441329682s" podCreationTimestamp="2025-09-29 11:15:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:15:38.436424784 +0000 UTC m=+1879.225566451" watchObservedRunningTime="2025-09-29 11:15:38.441329682 +0000 UTC m=+1879.230471349" Sep 29 11:15:38 crc kubenswrapper[4752]: I0929 11:15:38.465554 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.46553214 podStartE2EDuration="2.46553214s" podCreationTimestamp="2025-09-29 11:15:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:15:38.465139799 +0000 UTC m=+1879.254281476" watchObservedRunningTime="2025-09-29 11:15:38.46553214 +0000 UTC m=+1879.254673807" Sep 29 11:15:38 crc kubenswrapper[4752]: I0929 11:15:38.495023 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.495005316 podStartE2EDuration="2.495005316s" podCreationTimestamp="2025-09-29 11:15:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:15:38.488475014 +0000 UTC m=+1879.277616681" watchObservedRunningTime="2025-09-29 11:15:38.495005316 +0000 UTC 
m=+1879.284146983" Sep 29 11:15:40 crc kubenswrapper[4752]: I0929 11:15:40.436184 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 11:15:40 crc kubenswrapper[4752]: I0929 11:15:40.961979 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:42 crc kubenswrapper[4752]: I0929 11:15:42.307121 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:42 crc kubenswrapper[4752]: I0929 11:15:42.435858 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:15:47 crc kubenswrapper[4752]: I0929 11:15:47.042673 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:47 crc kubenswrapper[4752]: I0929 11:15:47.074067 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:47 crc kubenswrapper[4752]: I0929 11:15:47.306958 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:47 crc kubenswrapper[4752]: I0929 11:15:47.316395 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:47 crc kubenswrapper[4752]: I0929 11:15:47.436537 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:15:47 crc kubenswrapper[4752]: I0929 11:15:47.461542 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:15:47 crc kubenswrapper[4752]: I0929 11:15:47.494591 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:47 crc kubenswrapper[4752]: I0929 11:15:47.503945 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:15:47 crc kubenswrapper[4752]: I0929 11:15:47.532382 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:15:47 crc kubenswrapper[4752]: I0929 11:15:47.536849 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:15:47 crc kubenswrapper[4752]: I0929 11:15:47.649620 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:50 crc kubenswrapper[4752]: I0929 11:15:50.042441 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:15:50 crc kubenswrapper[4752]: I0929 11:15:50.043060 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="83dd436e-d5bb-40a5-93f7-d6587789f13f" containerName="ceilometer-central-agent" containerID="cri-o://628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074" gracePeriod=30 Sep 29 11:15:50 crc kubenswrapper[4752]: I0929 11:15:50.043288 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="83dd436e-d5bb-40a5-93f7-d6587789f13f" containerName="proxy-httpd" containerID="cri-o://c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d" gracePeriod=30 Sep 29 11:15:50 crc kubenswrapper[4752]: I0929 11:15:50.043346 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="83dd436e-d5bb-40a5-93f7-d6587789f13f" containerName="sg-core" 
containerID="cri-o://6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c" gracePeriod=30 Sep 29 11:15:50 crc kubenswrapper[4752]: I0929 11:15:50.043399 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="83dd436e-d5bb-40a5-93f7-d6587789f13f" containerName="ceilometer-notification-agent" containerID="cri-o://d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4" gracePeriod=30 Sep 29 11:15:50 crc kubenswrapper[4752]: I0929 11:15:50.529850 4752 generic.go:334] "Generic (PLEG): container finished" podID="83dd436e-d5bb-40a5-93f7-d6587789f13f" containerID="6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c" exitCode=2 Sep 29 11:15:50 crc kubenswrapper[4752]: I0929 11:15:50.530247 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"83dd436e-d5bb-40a5-93f7-d6587789f13f","Type":"ContainerDied","Data":"6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c"} Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.480161 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.542911 4752 generic.go:334] "Generic (PLEG): container finished" podID="83dd436e-d5bb-40a5-93f7-d6587789f13f" containerID="c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d" exitCode=0 Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.542944 4752 generic.go:334] "Generic (PLEG): container finished" podID="83dd436e-d5bb-40a5-93f7-d6587789f13f" containerID="d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4" exitCode=0 Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.542953 4752 generic.go:334] "Generic (PLEG): container finished" podID="83dd436e-d5bb-40a5-93f7-d6587789f13f" containerID="628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074" exitCode=0 Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.542974 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"83dd436e-d5bb-40a5-93f7-d6587789f13f","Type":"ContainerDied","Data":"c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d"} Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.543002 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"83dd436e-d5bb-40a5-93f7-d6587789f13f","Type":"ContainerDied","Data":"d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4"} Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.543013 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"83dd436e-d5bb-40a5-93f7-d6587789f13f","Type":"ContainerDied","Data":"628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074"} Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.543023 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"83dd436e-d5bb-40a5-93f7-d6587789f13f","Type":"ContainerDied","Data":"a5869d471f0631604f24a318bd87936c598ab96383d7c2a06273a712c742059d"} Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.543049 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.543053 4752 scope.go:117] "RemoveContainer" containerID="c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.575497 4752 scope.go:117] "RemoveContainer" containerID="6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.594002 4752 scope.go:117] "RemoveContainer" containerID="d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.613419 4752 scope.go:117] "RemoveContainer" containerID="628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.632778 4752 scope.go:117] "RemoveContainer" containerID="c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d" Sep 29 11:15:51 crc kubenswrapper[4752]: E0929 11:15:51.633339 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d\": container with ID starting with c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d not found: ID does not exist" containerID="c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.633373 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d"} err="failed to get container status 
\"c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d\": rpc error: code = NotFound desc = could not find container \"c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d\": container with ID starting with c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d not found: ID does not exist" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.633397 4752 scope.go:117] "RemoveContainer" containerID="6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c" Sep 29 11:15:51 crc kubenswrapper[4752]: E0929 11:15:51.633616 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c\": container with ID starting with 6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c not found: ID does not exist" containerID="6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.633639 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c"} err="failed to get container status \"6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c\": rpc error: code = NotFound desc = could not find container \"6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c\": container with ID starting with 6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c not found: ID does not exist" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.633656 4752 scope.go:117] "RemoveContainer" containerID="d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4" Sep 29 11:15:51 crc kubenswrapper[4752]: E0929 11:15:51.633884 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4\": container with ID starting with d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4 not found: ID does not exist" containerID="d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.633905 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4"} err="failed to get container status \"d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4\": rpc error: code = NotFound desc = could not find container \"d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4\": container with ID starting with d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4 not found: ID does not exist" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.633922 4752 scope.go:117] "RemoveContainer" containerID="628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074" Sep 29 11:15:51 crc kubenswrapper[4752]: E0929 11:15:51.634115 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074\": container with ID starting with 628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074 not found: ID does not exist" containerID="628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.634138 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074"} err="failed to get container status \"628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074\": rpc error: code = NotFound desc = could not find container \"628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074\": container with ID 
starting with 628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074 not found: ID does not exist" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.634156 4752 scope.go:117] "RemoveContainer" containerID="c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.634383 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d"} err="failed to get container status \"c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d\": rpc error: code = NotFound desc = could not find container \"c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d\": container with ID starting with c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d not found: ID does not exist" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.634400 4752 scope.go:117] "RemoveContainer" containerID="6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.634772 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c"} err="failed to get container status \"6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c\": rpc error: code = NotFound desc = could not find container \"6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c\": container with ID starting with 6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c not found: ID does not exist" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.634794 4752 scope.go:117] "RemoveContainer" containerID="d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.635979 4752 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4"} err="failed to get container status \"d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4\": rpc error: code = NotFound desc = could not find container \"d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4\": container with ID starting with d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4 not found: ID does not exist" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.636007 4752 scope.go:117] "RemoveContainer" containerID="628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.636344 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074"} err="failed to get container status \"628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074\": rpc error: code = NotFound desc = could not find container \"628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074\": container with ID starting with 628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074 not found: ID does not exist" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.636397 4752 scope.go:117] "RemoveContainer" containerID="c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.636778 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d"} err="failed to get container status \"c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d\": rpc error: code = NotFound desc = could not find container \"c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d\": container with ID starting with c150b5835d75a473616e7e55fc4891b9b49ac361decb82cb0cd0472dddb3af7d not found: ID does not 
exist" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.636841 4752 scope.go:117] "RemoveContainer" containerID="6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.637234 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c"} err="failed to get container status \"6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c\": rpc error: code = NotFound desc = could not find container \"6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c\": container with ID starting with 6d5b6eec2db4d0e77ea84569feef4a1a86c86a6b62cd32151362a9f18a91be7c not found: ID does not exist" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.637278 4752 scope.go:117] "RemoveContainer" containerID="d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.637721 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4"} err="failed to get container status \"d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4\": rpc error: code = NotFound desc = could not find container \"d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4\": container with ID starting with d68f6e86e1b11c3ebc15e7b5f47361ac86532bdc31b889c9450abc6b332910c4 not found: ID does not exist" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.637746 4752 scope.go:117] "RemoveContainer" containerID="628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.638022 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074"} err="failed to get container status 
\"628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074\": rpc error: code = NotFound desc = could not find container \"628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074\": container with ID starting with 628d0a5ad9c94c8061a32bab98243f472011f8877e9084b5e7597be812cf0074 not found: ID does not exist" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.649671 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-sg-core-conf-yaml\") pod \"83dd436e-d5bb-40a5-93f7-d6587789f13f\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.649754 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-ceilometer-tls-certs\") pod \"83dd436e-d5bb-40a5-93f7-d6587789f13f\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.649787 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4j5f\" (UniqueName: \"kubernetes.io/projected/83dd436e-d5bb-40a5-93f7-d6587789f13f-kube-api-access-g4j5f\") pod \"83dd436e-d5bb-40a5-93f7-d6587789f13f\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.649831 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-combined-ca-bundle\") pod \"83dd436e-d5bb-40a5-93f7-d6587789f13f\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.649850 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/83dd436e-d5bb-40a5-93f7-d6587789f13f-run-httpd\") pod \"83dd436e-d5bb-40a5-93f7-d6587789f13f\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.649915 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83dd436e-d5bb-40a5-93f7-d6587789f13f-log-httpd\") pod \"83dd436e-d5bb-40a5-93f7-d6587789f13f\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.649993 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-config-data\") pod \"83dd436e-d5bb-40a5-93f7-d6587789f13f\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.650035 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-scripts\") pod \"83dd436e-d5bb-40a5-93f7-d6587789f13f\" (UID: \"83dd436e-d5bb-40a5-93f7-d6587789f13f\") " Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.650891 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83dd436e-d5bb-40a5-93f7-d6587789f13f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "83dd436e-d5bb-40a5-93f7-d6587789f13f" (UID: "83dd436e-d5bb-40a5-93f7-d6587789f13f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.650913 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83dd436e-d5bb-40a5-93f7-d6587789f13f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "83dd436e-d5bb-40a5-93f7-d6587789f13f" (UID: "83dd436e-d5bb-40a5-93f7-d6587789f13f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.656417 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-scripts" (OuterVolumeSpecName: "scripts") pod "83dd436e-d5bb-40a5-93f7-d6587789f13f" (UID: "83dd436e-d5bb-40a5-93f7-d6587789f13f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.659048 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83dd436e-d5bb-40a5-93f7-d6587789f13f-kube-api-access-g4j5f" (OuterVolumeSpecName: "kube-api-access-g4j5f") pod "83dd436e-d5bb-40a5-93f7-d6587789f13f" (UID: "83dd436e-d5bb-40a5-93f7-d6587789f13f"). InnerVolumeSpecName "kube-api-access-g4j5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.675377 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "83dd436e-d5bb-40a5-93f7-d6587789f13f" (UID: "83dd436e-d5bb-40a5-93f7-d6587789f13f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.703334 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "83dd436e-d5bb-40a5-93f7-d6587789f13f" (UID: "83dd436e-d5bb-40a5-93f7-d6587789f13f"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.724220 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83dd436e-d5bb-40a5-93f7-d6587789f13f" (UID: "83dd436e-d5bb-40a5-93f7-d6587789f13f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.741731 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-config-data" (OuterVolumeSpecName: "config-data") pod "83dd436e-d5bb-40a5-93f7-d6587789f13f" (UID: "83dd436e-d5bb-40a5-93f7-d6587789f13f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.751845 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83dd436e-d5bb-40a5-93f7-d6587789f13f-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.751893 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.751905 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.751917 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:51 crc 
kubenswrapper[4752]: I0929 11:15:51.751932 4752 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.751948 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4j5f\" (UniqueName: \"kubernetes.io/projected/83dd436e-d5bb-40a5-93f7-d6587789f13f-kube-api-access-g4j5f\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.751962 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83dd436e-d5bb-40a5-93f7-d6587789f13f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.751972 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83dd436e-d5bb-40a5-93f7-d6587789f13f-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.876997 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.884873 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.904347 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:15:51 crc kubenswrapper[4752]: E0929 11:15:51.904759 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83dd436e-d5bb-40a5-93f7-d6587789f13f" containerName="sg-core" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.904780 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="83dd436e-d5bb-40a5-93f7-d6587789f13f" containerName="sg-core" Sep 29 11:15:51 crc kubenswrapper[4752]: E0929 11:15:51.904844 4752 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83dd436e-d5bb-40a5-93f7-d6587789f13f" containerName="proxy-httpd" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.904855 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="83dd436e-d5bb-40a5-93f7-d6587789f13f" containerName="proxy-httpd" Sep 29 11:15:51 crc kubenswrapper[4752]: E0929 11:15:51.904870 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83dd436e-d5bb-40a5-93f7-d6587789f13f" containerName="ceilometer-notification-agent" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.904880 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="83dd436e-d5bb-40a5-93f7-d6587789f13f" containerName="ceilometer-notification-agent" Sep 29 11:15:51 crc kubenswrapper[4752]: E0929 11:15:51.904904 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83dd436e-d5bb-40a5-93f7-d6587789f13f" containerName="ceilometer-central-agent" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.904913 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="83dd436e-d5bb-40a5-93f7-d6587789f13f" containerName="ceilometer-central-agent" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.905110 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="83dd436e-d5bb-40a5-93f7-d6587789f13f" containerName="ceilometer-central-agent" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.905142 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="83dd436e-d5bb-40a5-93f7-d6587789f13f" containerName="proxy-httpd" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.905157 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="83dd436e-d5bb-40a5-93f7-d6587789f13f" containerName="ceilometer-notification-agent" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.905167 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="83dd436e-d5bb-40a5-93f7-d6587789f13f" containerName="sg-core" Sep 29 11:15:51 
crc kubenswrapper[4752]: I0929 11:15:51.907123 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.908635 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.909096 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.909359 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Sep 29 11:15:51 crc kubenswrapper[4752]: I0929 11:15:51.918371 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.041546 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83dd436e-d5bb-40a5-93f7-d6587789f13f" path="/var/lib/kubelet/pods/83dd436e-d5bb-40a5-93f7-d6587789f13f/volumes" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.056570 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-scripts\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.056660 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l955s\" (UniqueName: \"kubernetes.io/projected/ec5e871e-bdda-4e23-8272-885dc11508b0-kube-api-access-l955s\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.056692 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec5e871e-bdda-4e23-8272-885dc11508b0-run-httpd\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.056726 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.056763 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.056793 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec5e871e-bdda-4e23-8272-885dc11508b0-log-httpd\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.056879 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.056909 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-config-data\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.158995 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.159073 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-config-data\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.159163 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-scripts\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.159207 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l955s\" (UniqueName: \"kubernetes.io/projected/ec5e871e-bdda-4e23-8272-885dc11508b0-kube-api-access-l955s\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.159235 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec5e871e-bdda-4e23-8272-885dc11508b0-run-httpd\") pod \"ceilometer-0\" (UID: 
\"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.159277 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.159318 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.159351 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec5e871e-bdda-4e23-8272-885dc11508b0-log-httpd\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.160453 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec5e871e-bdda-4e23-8272-885dc11508b0-run-httpd\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.161126 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec5e871e-bdda-4e23-8272-885dc11508b0-log-httpd\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.163508 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.163768 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-scripts\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.164380 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-config-data\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.165228 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.166837 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.182117 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l955s\" (UniqueName: \"kubernetes.io/projected/ec5e871e-bdda-4e23-8272-885dc11508b0-kube-api-access-l955s\") pod \"ceilometer-0\" (UID: 
\"ec5e871e-bdda-4e23-8272-885dc11508b0\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.227891 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:52 crc kubenswrapper[4752]: I0929 11:15:52.733733 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:15:52 crc kubenswrapper[4752]: W0929 11:15:52.742437 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec5e871e_bdda_4e23_8272_885dc11508b0.slice/crio-f85e735580def5f136789a0da867928ae873f88596556498f8f397aeb08c6f13 WatchSource:0}: Error finding container f85e735580def5f136789a0da867928ae873f88596556498f8f397aeb08c6f13: Status 404 returned error can't find the container with id f85e735580def5f136789a0da867928ae873f88596556498f8f397aeb08c6f13 Sep 29 11:15:53 crc kubenswrapper[4752]: I0929 11:15:53.561191 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ec5e871e-bdda-4e23-8272-885dc11508b0","Type":"ContainerStarted","Data":"f85e735580def5f136789a0da867928ae873f88596556498f8f397aeb08c6f13"} Sep 29 11:15:54 crc kubenswrapper[4752]: I0929 11:15:54.571284 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ec5e871e-bdda-4e23-8272-885dc11508b0","Type":"ContainerStarted","Data":"91374e3675552cbcbff4c57f4c196bfb1d123bcd3e926b8c6567f208c5f12a0a"} Sep 29 11:15:55 crc kubenswrapper[4752]: I0929 11:15:55.597360 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ec5e871e-bdda-4e23-8272-885dc11508b0","Type":"ContainerStarted","Data":"716804ba5a6f0b8313c456060a661291191ca055cbbb93e747fd1452e4a593c0"} Sep 29 11:15:56 crc kubenswrapper[4752]: I0929 11:15:56.291894 4752 provider.go:102] Refreshing 
cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 11:15:56 crc kubenswrapper[4752]: I0929 11:15:56.612767 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ec5e871e-bdda-4e23-8272-885dc11508b0","Type":"ContainerStarted","Data":"9901c1bf9f831b52d6b1cacae0e986c4ac21d9ef6a50d6356aecc6ec7e3033e7"} Sep 29 11:15:59 crc kubenswrapper[4752]: I0929 11:15:59.648904 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ec5e871e-bdda-4e23-8272-885dc11508b0","Type":"ContainerStarted","Data":"417a7ce2376ea65c4d417e8c1805bd4c0a2d82be06b56a5cff0593eb996d01be"} Sep 29 11:15:59 crc kubenswrapper[4752]: I0929 11:15:59.649556 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:15:59 crc kubenswrapper[4752]: I0929 11:15:59.681692 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.511810317 podStartE2EDuration="8.681667178s" podCreationTimestamp="2025-09-29 11:15:51 +0000 UTC" firstStartedPulling="2025-09-29 11:15:52.744962171 +0000 UTC m=+1893.534103838" lastFinishedPulling="2025-09-29 11:15:58.914819042 +0000 UTC m=+1899.703960699" observedRunningTime="2025-09-29 11:15:59.672318163 +0000 UTC m=+1900.461459830" watchObservedRunningTime="2025-09-29 11:15:59.681667178 +0000 UTC m=+1900.470808885" Sep 29 11:16:21 crc kubenswrapper[4752]: I0929 11:16:21.374267 4752 scope.go:117] "RemoveContainer" containerID="30e5b3313d62a38a3cc1d0ae790605dd5cc2e3cc17a37a875a5e931339bdd518" Sep 29 11:16:22 crc kubenswrapper[4752]: I0929 11:16:22.236512 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:16:26 crc kubenswrapper[4752]: I0929 11:16:26.175826 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 11:16:26 crc kubenswrapper[4752]: I0929 11:16:26.176428 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:16:56 crc kubenswrapper[4752]: I0929 11:16:56.175719 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 11:16:56 crc kubenswrapper[4752]: I0929 11:16:56.176170 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:17:19 crc kubenswrapper[4752]: E0929 11:17:19.045296 4752 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.21:41910->38.102.83.21:41527: write tcp 38.102.83.21:41910->38.102.83.21:41527: write: broken pipe Sep 29 11:17:21 crc kubenswrapper[4752]: I0929 11:17:21.456367 4752 scope.go:117] "RemoveContainer" containerID="cb8a557c5b5a0ca2f113f14da7e6e690901826eefcdf792986324eb9d242d188" Sep 29 11:17:21 crc kubenswrapper[4752]: I0929 11:17:21.476535 4752 scope.go:117] "RemoveContainer" containerID="c99e3da615c2d0a795c0b7893146592e1a9e083d04457218db84f9dedba7aaf8" Sep 29 11:17:26 crc 
kubenswrapper[4752]: I0929 11:17:26.175537 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 11:17:26 crc kubenswrapper[4752]: I0929 11:17:26.177461 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:17:26 crc kubenswrapper[4752]: I0929 11:17:26.177651 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 11:17:26 crc kubenswrapper[4752]: I0929 11:17:26.179653 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"910e9715ff191c5fe48e666106182c9ab8ee872d75d2fda3908f263b58dd32be"} pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 11:17:26 crc kubenswrapper[4752]: I0929 11:17:26.179769 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" containerID="cri-o://910e9715ff191c5fe48e666106182c9ab8ee872d75d2fda3908f263b58dd32be" gracePeriod=600 Sep 29 11:17:26 crc kubenswrapper[4752]: I0929 11:17:26.366398 4752 generic.go:334] "Generic (PLEG): container finished" podID="5863c243-797d-462a-b11f-71aaf005f8d1" 
containerID="910e9715ff191c5fe48e666106182c9ab8ee872d75d2fda3908f263b58dd32be" exitCode=0 Sep 29 11:17:26 crc kubenswrapper[4752]: I0929 11:17:26.366486 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerDied","Data":"910e9715ff191c5fe48e666106182c9ab8ee872d75d2fda3908f263b58dd32be"} Sep 29 11:17:26 crc kubenswrapper[4752]: I0929 11:17:26.366875 4752 scope.go:117] "RemoveContainer" containerID="18eab399f36ee078445fd05909a0d35ada9fdfa2424d9729b71ad67d5ec2e670" Sep 29 11:17:27 crc kubenswrapper[4752]: I0929 11:17:27.377415 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerStarted","Data":"93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37"} Sep 29 11:18:34 crc kubenswrapper[4752]: I0929 11:18:34.808997 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7rqgg"] Sep 29 11:18:34 crc kubenswrapper[4752]: I0929 11:18:34.830411 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7rqgg" Sep 29 11:18:34 crc kubenswrapper[4752]: I0929 11:18:34.851293 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7rqgg"] Sep 29 11:18:34 crc kubenswrapper[4752]: I0929 11:18:34.997497 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e81527ea-5684-45d4-a543-61c65a064357-utilities\") pod \"redhat-marketplace-7rqgg\" (UID: \"e81527ea-5684-45d4-a543-61c65a064357\") " pod="openshift-marketplace/redhat-marketplace-7rqgg" Sep 29 11:18:34 crc kubenswrapper[4752]: I0929 11:18:34.997693 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw72k\" (UniqueName: \"kubernetes.io/projected/e81527ea-5684-45d4-a543-61c65a064357-kube-api-access-vw72k\") pod \"redhat-marketplace-7rqgg\" (UID: \"e81527ea-5684-45d4-a543-61c65a064357\") " pod="openshift-marketplace/redhat-marketplace-7rqgg" Sep 29 11:18:34 crc kubenswrapper[4752]: I0929 11:18:34.997748 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e81527ea-5684-45d4-a543-61c65a064357-catalog-content\") pod \"redhat-marketplace-7rqgg\" (UID: \"e81527ea-5684-45d4-a543-61c65a064357\") " pod="openshift-marketplace/redhat-marketplace-7rqgg" Sep 29 11:18:35 crc kubenswrapper[4752]: I0929 11:18:35.100123 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw72k\" (UniqueName: \"kubernetes.io/projected/e81527ea-5684-45d4-a543-61c65a064357-kube-api-access-vw72k\") pod \"redhat-marketplace-7rqgg\" (UID: \"e81527ea-5684-45d4-a543-61c65a064357\") " pod="openshift-marketplace/redhat-marketplace-7rqgg" Sep 29 11:18:35 crc kubenswrapper[4752]: I0929 11:18:35.100306 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e81527ea-5684-45d4-a543-61c65a064357-catalog-content\") pod \"redhat-marketplace-7rqgg\" (UID: \"e81527ea-5684-45d4-a543-61c65a064357\") " pod="openshift-marketplace/redhat-marketplace-7rqgg" Sep 29 11:18:35 crc kubenswrapper[4752]: I0929 11:18:35.100441 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e81527ea-5684-45d4-a543-61c65a064357-utilities\") pod \"redhat-marketplace-7rqgg\" (UID: \"e81527ea-5684-45d4-a543-61c65a064357\") " pod="openshift-marketplace/redhat-marketplace-7rqgg" Sep 29 11:18:35 crc kubenswrapper[4752]: I0929 11:18:35.100921 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e81527ea-5684-45d4-a543-61c65a064357-catalog-content\") pod \"redhat-marketplace-7rqgg\" (UID: \"e81527ea-5684-45d4-a543-61c65a064357\") " pod="openshift-marketplace/redhat-marketplace-7rqgg" Sep 29 11:18:35 crc kubenswrapper[4752]: I0929 11:18:35.101116 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e81527ea-5684-45d4-a543-61c65a064357-utilities\") pod \"redhat-marketplace-7rqgg\" (UID: \"e81527ea-5684-45d4-a543-61c65a064357\") " pod="openshift-marketplace/redhat-marketplace-7rqgg" Sep 29 11:18:35 crc kubenswrapper[4752]: I0929 11:18:35.120882 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw72k\" (UniqueName: \"kubernetes.io/projected/e81527ea-5684-45d4-a543-61c65a064357-kube-api-access-vw72k\") pod \"redhat-marketplace-7rqgg\" (UID: \"e81527ea-5684-45d4-a543-61c65a064357\") " pod="openshift-marketplace/redhat-marketplace-7rqgg" Sep 29 11:18:35 crc kubenswrapper[4752]: I0929 11:18:35.157354 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7rqgg" Sep 29 11:18:35 crc kubenswrapper[4752]: I0929 11:18:35.591789 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7rqgg"] Sep 29 11:18:35 crc kubenswrapper[4752]: I0929 11:18:35.931495 4752 generic.go:334] "Generic (PLEG): container finished" podID="e81527ea-5684-45d4-a543-61c65a064357" containerID="67aaf2f4528f86f8e02dd8ced4a2167c9c9fba7ede11eeb2fd9d05383cee0576" exitCode=0 Sep 29 11:18:35 crc kubenswrapper[4752]: I0929 11:18:35.931667 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rqgg" event={"ID":"e81527ea-5684-45d4-a543-61c65a064357","Type":"ContainerDied","Data":"67aaf2f4528f86f8e02dd8ced4a2167c9c9fba7ede11eeb2fd9d05383cee0576"} Sep 29 11:18:35 crc kubenswrapper[4752]: I0929 11:18:35.931884 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rqgg" event={"ID":"e81527ea-5684-45d4-a543-61c65a064357","Type":"ContainerStarted","Data":"a1b4da4d20fdfeff8fd3cd268ee62e5294d84f150be23c71cdd6ad90fa26f7b9"} Sep 29 11:18:36 crc kubenswrapper[4752]: I0929 11:18:36.941510 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rqgg" event={"ID":"e81527ea-5684-45d4-a543-61c65a064357","Type":"ContainerStarted","Data":"5a22b2cc50a5be3cf7619c398fa207f47f1ca089ed0114bba580bfb3dda28183"} Sep 29 11:18:37 crc kubenswrapper[4752]: I0929 11:18:37.952199 4752 generic.go:334] "Generic (PLEG): container finished" podID="e81527ea-5684-45d4-a543-61c65a064357" containerID="5a22b2cc50a5be3cf7619c398fa207f47f1ca089ed0114bba580bfb3dda28183" exitCode=0 Sep 29 11:18:37 crc kubenswrapper[4752]: I0929 11:18:37.952250 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rqgg" 
event={"ID":"e81527ea-5684-45d4-a543-61c65a064357","Type":"ContainerDied","Data":"5a22b2cc50a5be3cf7619c398fa207f47f1ca089ed0114bba580bfb3dda28183"} Sep 29 11:18:38 crc kubenswrapper[4752]: I0929 11:18:38.962400 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rqgg" event={"ID":"e81527ea-5684-45d4-a543-61c65a064357","Type":"ContainerStarted","Data":"90e01c698867e9ea04a25bc3f53c2c94a4a880db6e68ccaeb05be760703ebf08"} Sep 29 11:18:38 crc kubenswrapper[4752]: I0929 11:18:38.983115 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7rqgg" podStartSLOduration=2.411131153 podStartE2EDuration="4.983091526s" podCreationTimestamp="2025-09-29 11:18:34 +0000 UTC" firstStartedPulling="2025-09-29 11:18:35.933579609 +0000 UTC m=+2056.722721276" lastFinishedPulling="2025-09-29 11:18:38.505539982 +0000 UTC m=+2059.294681649" observedRunningTime="2025-09-29 11:18:38.976867832 +0000 UTC m=+2059.766009509" watchObservedRunningTime="2025-09-29 11:18:38.983091526 +0000 UTC m=+2059.772233193" Sep 29 11:18:45 crc kubenswrapper[4752]: I0929 11:18:45.157668 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7rqgg" Sep 29 11:18:45 crc kubenswrapper[4752]: I0929 11:18:45.158277 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7rqgg" Sep 29 11:18:45 crc kubenswrapper[4752]: I0929 11:18:45.201373 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7rqgg" Sep 29 11:18:46 crc kubenswrapper[4752]: I0929 11:18:46.079688 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7rqgg" Sep 29 11:18:48 crc kubenswrapper[4752]: I0929 11:18:48.778795 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-7rqgg"] Sep 29 11:18:48 crc kubenswrapper[4752]: I0929 11:18:48.779533 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7rqgg" podUID="e81527ea-5684-45d4-a543-61c65a064357" containerName="registry-server" containerID="cri-o://90e01c698867e9ea04a25bc3f53c2c94a4a880db6e68ccaeb05be760703ebf08" gracePeriod=2 Sep 29 11:18:49 crc kubenswrapper[4752]: I0929 11:18:49.797997 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7rqgg" Sep 29 11:18:49 crc kubenswrapper[4752]: I0929 11:18:49.868584 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e81527ea-5684-45d4-a543-61c65a064357-utilities\") pod \"e81527ea-5684-45d4-a543-61c65a064357\" (UID: \"e81527ea-5684-45d4-a543-61c65a064357\") " Sep 29 11:18:49 crc kubenswrapper[4752]: I0929 11:18:49.868653 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw72k\" (UniqueName: \"kubernetes.io/projected/e81527ea-5684-45d4-a543-61c65a064357-kube-api-access-vw72k\") pod \"e81527ea-5684-45d4-a543-61c65a064357\" (UID: \"e81527ea-5684-45d4-a543-61c65a064357\") " Sep 29 11:18:49 crc kubenswrapper[4752]: I0929 11:18:49.868696 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e81527ea-5684-45d4-a543-61c65a064357-catalog-content\") pod \"e81527ea-5684-45d4-a543-61c65a064357\" (UID: \"e81527ea-5684-45d4-a543-61c65a064357\") " Sep 29 11:18:49 crc kubenswrapper[4752]: I0929 11:18:49.870341 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e81527ea-5684-45d4-a543-61c65a064357-utilities" (OuterVolumeSpecName: "utilities") pod "e81527ea-5684-45d4-a543-61c65a064357" (UID: 
"e81527ea-5684-45d4-a543-61c65a064357"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:18:49 crc kubenswrapper[4752]: I0929 11:18:49.874280 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81527ea-5684-45d4-a543-61c65a064357-kube-api-access-vw72k" (OuterVolumeSpecName: "kube-api-access-vw72k") pod "e81527ea-5684-45d4-a543-61c65a064357" (UID: "e81527ea-5684-45d4-a543-61c65a064357"). InnerVolumeSpecName "kube-api-access-vw72k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:18:49 crc kubenswrapper[4752]: I0929 11:18:49.886239 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e81527ea-5684-45d4-a543-61c65a064357-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e81527ea-5684-45d4-a543-61c65a064357" (UID: "e81527ea-5684-45d4-a543-61c65a064357"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:18:49 crc kubenswrapper[4752]: I0929 11:18:49.970961 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e81527ea-5684-45d4-a543-61c65a064357-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 11:18:49 crc kubenswrapper[4752]: I0929 11:18:49.970999 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw72k\" (UniqueName: \"kubernetes.io/projected/e81527ea-5684-45d4-a543-61c65a064357-kube-api-access-vw72k\") on node \"crc\" DevicePath \"\"" Sep 29 11:18:49 crc kubenswrapper[4752]: I0929 11:18:49.971009 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e81527ea-5684-45d4-a543-61c65a064357-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 11:18:50 crc kubenswrapper[4752]: I0929 11:18:50.051791 4752 generic.go:334] "Generic (PLEG): container finished" 
podID="e81527ea-5684-45d4-a543-61c65a064357" containerID="90e01c698867e9ea04a25bc3f53c2c94a4a880db6e68ccaeb05be760703ebf08" exitCode=0 Sep 29 11:18:50 crc kubenswrapper[4752]: I0929 11:18:50.051864 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rqgg" event={"ID":"e81527ea-5684-45d4-a543-61c65a064357","Type":"ContainerDied","Data":"90e01c698867e9ea04a25bc3f53c2c94a4a880db6e68ccaeb05be760703ebf08"} Sep 29 11:18:50 crc kubenswrapper[4752]: I0929 11:18:50.051892 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rqgg" event={"ID":"e81527ea-5684-45d4-a543-61c65a064357","Type":"ContainerDied","Data":"a1b4da4d20fdfeff8fd3cd268ee62e5294d84f150be23c71cdd6ad90fa26f7b9"} Sep 29 11:18:50 crc kubenswrapper[4752]: I0929 11:18:50.051912 4752 scope.go:117] "RemoveContainer" containerID="90e01c698867e9ea04a25bc3f53c2c94a4a880db6e68ccaeb05be760703ebf08" Sep 29 11:18:50 crc kubenswrapper[4752]: I0929 11:18:50.051909 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7rqgg" Sep 29 11:18:50 crc kubenswrapper[4752]: I0929 11:18:50.071652 4752 scope.go:117] "RemoveContainer" containerID="5a22b2cc50a5be3cf7619c398fa207f47f1ca089ed0114bba580bfb3dda28183" Sep 29 11:18:50 crc kubenswrapper[4752]: I0929 11:18:50.099826 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7rqgg"] Sep 29 11:18:50 crc kubenswrapper[4752]: I0929 11:18:50.101833 4752 scope.go:117] "RemoveContainer" containerID="67aaf2f4528f86f8e02dd8ced4a2167c9c9fba7ede11eeb2fd9d05383cee0576" Sep 29 11:18:50 crc kubenswrapper[4752]: I0929 11:18:50.106679 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7rqgg"] Sep 29 11:18:50 crc kubenswrapper[4752]: I0929 11:18:50.147012 4752 scope.go:117] "RemoveContainer" containerID="90e01c698867e9ea04a25bc3f53c2c94a4a880db6e68ccaeb05be760703ebf08" Sep 29 11:18:50 crc kubenswrapper[4752]: E0929 11:18:50.147738 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90e01c698867e9ea04a25bc3f53c2c94a4a880db6e68ccaeb05be760703ebf08\": container with ID starting with 90e01c698867e9ea04a25bc3f53c2c94a4a880db6e68ccaeb05be760703ebf08 not found: ID does not exist" containerID="90e01c698867e9ea04a25bc3f53c2c94a4a880db6e68ccaeb05be760703ebf08" Sep 29 11:18:50 crc kubenswrapper[4752]: I0929 11:18:50.147826 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90e01c698867e9ea04a25bc3f53c2c94a4a880db6e68ccaeb05be760703ebf08"} err="failed to get container status \"90e01c698867e9ea04a25bc3f53c2c94a4a880db6e68ccaeb05be760703ebf08\": rpc error: code = NotFound desc = could not find container \"90e01c698867e9ea04a25bc3f53c2c94a4a880db6e68ccaeb05be760703ebf08\": container with ID starting with 90e01c698867e9ea04a25bc3f53c2c94a4a880db6e68ccaeb05be760703ebf08 not found: 
ID does not exist" Sep 29 11:18:50 crc kubenswrapper[4752]: I0929 11:18:50.147877 4752 scope.go:117] "RemoveContainer" containerID="5a22b2cc50a5be3cf7619c398fa207f47f1ca089ed0114bba580bfb3dda28183" Sep 29 11:18:50 crc kubenswrapper[4752]: E0929 11:18:50.148439 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a22b2cc50a5be3cf7619c398fa207f47f1ca089ed0114bba580bfb3dda28183\": container with ID starting with 5a22b2cc50a5be3cf7619c398fa207f47f1ca089ed0114bba580bfb3dda28183 not found: ID does not exist" containerID="5a22b2cc50a5be3cf7619c398fa207f47f1ca089ed0114bba580bfb3dda28183" Sep 29 11:18:50 crc kubenswrapper[4752]: I0929 11:18:50.148513 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a22b2cc50a5be3cf7619c398fa207f47f1ca089ed0114bba580bfb3dda28183"} err="failed to get container status \"5a22b2cc50a5be3cf7619c398fa207f47f1ca089ed0114bba580bfb3dda28183\": rpc error: code = NotFound desc = could not find container \"5a22b2cc50a5be3cf7619c398fa207f47f1ca089ed0114bba580bfb3dda28183\": container with ID starting with 5a22b2cc50a5be3cf7619c398fa207f47f1ca089ed0114bba580bfb3dda28183 not found: ID does not exist" Sep 29 11:18:50 crc kubenswrapper[4752]: I0929 11:18:50.148548 4752 scope.go:117] "RemoveContainer" containerID="67aaf2f4528f86f8e02dd8ced4a2167c9c9fba7ede11eeb2fd9d05383cee0576" Sep 29 11:18:50 crc kubenswrapper[4752]: E0929 11:18:50.149141 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67aaf2f4528f86f8e02dd8ced4a2167c9c9fba7ede11eeb2fd9d05383cee0576\": container with ID starting with 67aaf2f4528f86f8e02dd8ced4a2167c9c9fba7ede11eeb2fd9d05383cee0576 not found: ID does not exist" containerID="67aaf2f4528f86f8e02dd8ced4a2167c9c9fba7ede11eeb2fd9d05383cee0576" Sep 29 11:18:50 crc kubenswrapper[4752]: I0929 11:18:50.149281 4752 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67aaf2f4528f86f8e02dd8ced4a2167c9c9fba7ede11eeb2fd9d05383cee0576"} err="failed to get container status \"67aaf2f4528f86f8e02dd8ced4a2167c9c9fba7ede11eeb2fd9d05383cee0576\": rpc error: code = NotFound desc = could not find container \"67aaf2f4528f86f8e02dd8ced4a2167c9c9fba7ede11eeb2fd9d05383cee0576\": container with ID starting with 67aaf2f4528f86f8e02dd8ced4a2167c9c9fba7ede11eeb2fd9d05383cee0576 not found: ID does not exist" Sep 29 11:18:52 crc kubenswrapper[4752]: I0929 11:18:52.046415 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e81527ea-5684-45d4-a543-61c65a064357" path="/var/lib/kubelet/pods/e81527ea-5684-45d4-a543-61c65a064357/volumes" Sep 29 11:19:26 crc kubenswrapper[4752]: I0929 11:19:26.175821 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 11:19:26 crc kubenswrapper[4752]: I0929 11:19:26.176520 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:19:41 crc kubenswrapper[4752]: I0929 11:19:41.989482 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7l2nn"] Sep 29 11:19:41 crc kubenswrapper[4752]: E0929 11:19:41.991767 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81527ea-5684-45d4-a543-61c65a064357" containerName="extract-utilities" Sep 29 11:19:41 crc kubenswrapper[4752]: I0929 11:19:41.991889 4752 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e81527ea-5684-45d4-a543-61c65a064357" containerName="extract-utilities" Sep 29 11:19:41 crc kubenswrapper[4752]: E0929 11:19:41.991977 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81527ea-5684-45d4-a543-61c65a064357" containerName="registry-server" Sep 29 11:19:41 crc kubenswrapper[4752]: I0929 11:19:41.992058 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81527ea-5684-45d4-a543-61c65a064357" containerName="registry-server" Sep 29 11:19:41 crc kubenswrapper[4752]: E0929 11:19:41.992781 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81527ea-5684-45d4-a543-61c65a064357" containerName="extract-content" Sep 29 11:19:41 crc kubenswrapper[4752]: I0929 11:19:41.992884 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81527ea-5684-45d4-a543-61c65a064357" containerName="extract-content" Sep 29 11:19:41 crc kubenswrapper[4752]: I0929 11:19:41.993154 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81527ea-5684-45d4-a543-61c65a064357" containerName="registry-server" Sep 29 11:19:41 crc kubenswrapper[4752]: I0929 11:19:41.995445 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7l2nn" Sep 29 11:19:41 crc kubenswrapper[4752]: I0929 11:19:41.999683 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7l2nn"] Sep 29 11:19:42 crc kubenswrapper[4752]: I0929 11:19:42.137386 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jzzr\" (UniqueName: \"kubernetes.io/projected/75317a48-de24-4698-ae2b-5b7a34cacd1c-kube-api-access-9jzzr\") pod \"community-operators-7l2nn\" (UID: \"75317a48-de24-4698-ae2b-5b7a34cacd1c\") " pod="openshift-marketplace/community-operators-7l2nn" Sep 29 11:19:42 crc kubenswrapper[4752]: I0929 11:19:42.137507 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75317a48-de24-4698-ae2b-5b7a34cacd1c-catalog-content\") pod \"community-operators-7l2nn\" (UID: \"75317a48-de24-4698-ae2b-5b7a34cacd1c\") " pod="openshift-marketplace/community-operators-7l2nn" Sep 29 11:19:42 crc kubenswrapper[4752]: I0929 11:19:42.137560 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75317a48-de24-4698-ae2b-5b7a34cacd1c-utilities\") pod \"community-operators-7l2nn\" (UID: \"75317a48-de24-4698-ae2b-5b7a34cacd1c\") " pod="openshift-marketplace/community-operators-7l2nn" Sep 29 11:19:42 crc kubenswrapper[4752]: I0929 11:19:42.238683 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jzzr\" (UniqueName: \"kubernetes.io/projected/75317a48-de24-4698-ae2b-5b7a34cacd1c-kube-api-access-9jzzr\") pod \"community-operators-7l2nn\" (UID: \"75317a48-de24-4698-ae2b-5b7a34cacd1c\") " pod="openshift-marketplace/community-operators-7l2nn" Sep 29 11:19:42 crc kubenswrapper[4752]: I0929 11:19:42.239027 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75317a48-de24-4698-ae2b-5b7a34cacd1c-catalog-content\") pod \"community-operators-7l2nn\" (UID: \"75317a48-de24-4698-ae2b-5b7a34cacd1c\") " pod="openshift-marketplace/community-operators-7l2nn" Sep 29 11:19:42 crc kubenswrapper[4752]: I0929 11:19:42.239162 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75317a48-de24-4698-ae2b-5b7a34cacd1c-utilities\") pod \"community-operators-7l2nn\" (UID: \"75317a48-de24-4698-ae2b-5b7a34cacd1c\") " pod="openshift-marketplace/community-operators-7l2nn" Sep 29 11:19:42 crc kubenswrapper[4752]: I0929 11:19:42.239563 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75317a48-de24-4698-ae2b-5b7a34cacd1c-catalog-content\") pod \"community-operators-7l2nn\" (UID: \"75317a48-de24-4698-ae2b-5b7a34cacd1c\") " pod="openshift-marketplace/community-operators-7l2nn" Sep 29 11:19:42 crc kubenswrapper[4752]: I0929 11:19:42.239604 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75317a48-de24-4698-ae2b-5b7a34cacd1c-utilities\") pod \"community-operators-7l2nn\" (UID: \"75317a48-de24-4698-ae2b-5b7a34cacd1c\") " pod="openshift-marketplace/community-operators-7l2nn" Sep 29 11:19:42 crc kubenswrapper[4752]: I0929 11:19:42.259019 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jzzr\" (UniqueName: \"kubernetes.io/projected/75317a48-de24-4698-ae2b-5b7a34cacd1c-kube-api-access-9jzzr\") pod \"community-operators-7l2nn\" (UID: \"75317a48-de24-4698-ae2b-5b7a34cacd1c\") " pod="openshift-marketplace/community-operators-7l2nn" Sep 29 11:19:42 crc kubenswrapper[4752]: I0929 11:19:42.323449 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7l2nn" Sep 29 11:19:42 crc kubenswrapper[4752]: I0929 11:19:42.813371 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7l2nn"] Sep 29 11:19:43 crc kubenswrapper[4752]: I0929 11:19:43.502514 4752 generic.go:334] "Generic (PLEG): container finished" podID="75317a48-de24-4698-ae2b-5b7a34cacd1c" containerID="f15a8bb442a51a678ebde0546d73481be85419f62f5a5c7ce896f4bc32bed92c" exitCode=0 Sep 29 11:19:43 crc kubenswrapper[4752]: I0929 11:19:43.502573 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7l2nn" event={"ID":"75317a48-de24-4698-ae2b-5b7a34cacd1c","Type":"ContainerDied","Data":"f15a8bb442a51a678ebde0546d73481be85419f62f5a5c7ce896f4bc32bed92c"} Sep 29 11:19:43 crc kubenswrapper[4752]: I0929 11:19:43.502754 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7l2nn" event={"ID":"75317a48-de24-4698-ae2b-5b7a34cacd1c","Type":"ContainerStarted","Data":"995ff86cd1b03da90352347fe93cb12a981033fd5d2832617638e9e76ed3f47f"} Sep 29 11:19:46 crc kubenswrapper[4752]: I0929 11:19:46.533396 4752 generic.go:334] "Generic (PLEG): container finished" podID="75317a48-de24-4698-ae2b-5b7a34cacd1c" containerID="677ab6c2f5bed7596e70d334d998a9e911915f5add047779a0020fe282fe2d04" exitCode=0 Sep 29 11:19:46 crc kubenswrapper[4752]: I0929 11:19:46.533546 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7l2nn" event={"ID":"75317a48-de24-4698-ae2b-5b7a34cacd1c","Type":"ContainerDied","Data":"677ab6c2f5bed7596e70d334d998a9e911915f5add047779a0020fe282fe2d04"} Sep 29 11:19:48 crc kubenswrapper[4752]: I0929 11:19:48.554839 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7l2nn" 
event={"ID":"75317a48-de24-4698-ae2b-5b7a34cacd1c","Type":"ContainerStarted","Data":"eeef5656eeb1c35fe864f226bec885b77000cf1e0b00ea410bd57ddb27f9d93b"} Sep 29 11:19:48 crc kubenswrapper[4752]: I0929 11:19:48.580683 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7l2nn" podStartSLOduration=3.672557517 podStartE2EDuration="7.580666948s" podCreationTimestamp="2025-09-29 11:19:41 +0000 UTC" firstStartedPulling="2025-09-29 11:19:43.50498868 +0000 UTC m=+2124.294130347" lastFinishedPulling="2025-09-29 11:19:47.413098101 +0000 UTC m=+2128.202239778" observedRunningTime="2025-09-29 11:19:48.572171264 +0000 UTC m=+2129.361312931" watchObservedRunningTime="2025-09-29 11:19:48.580666948 +0000 UTC m=+2129.369808615" Sep 29 11:19:52 crc kubenswrapper[4752]: I0929 11:19:52.323686 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7l2nn" Sep 29 11:19:52 crc kubenswrapper[4752]: I0929 11:19:52.324081 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7l2nn" Sep 29 11:19:52 crc kubenswrapper[4752]: I0929 11:19:52.395111 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7l2nn" Sep 29 11:19:52 crc kubenswrapper[4752]: I0929 11:19:52.622138 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7l2nn" Sep 29 11:19:54 crc kubenswrapper[4752]: I0929 11:19:54.587692 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7l2nn"] Sep 29 11:19:54 crc kubenswrapper[4752]: I0929 11:19:54.601512 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7l2nn" podUID="75317a48-de24-4698-ae2b-5b7a34cacd1c" containerName="registry-server" 
containerID="cri-o://eeef5656eeb1c35fe864f226bec885b77000cf1e0b00ea410bd57ddb27f9d93b" gracePeriod=2 Sep 29 11:19:55 crc kubenswrapper[4752]: I0929 11:19:55.611068 4752 generic.go:334] "Generic (PLEG): container finished" podID="75317a48-de24-4698-ae2b-5b7a34cacd1c" containerID="eeef5656eeb1c35fe864f226bec885b77000cf1e0b00ea410bd57ddb27f9d93b" exitCode=0 Sep 29 11:19:55 crc kubenswrapper[4752]: I0929 11:19:55.611143 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7l2nn" event={"ID":"75317a48-de24-4698-ae2b-5b7a34cacd1c","Type":"ContainerDied","Data":"eeef5656eeb1c35fe864f226bec885b77000cf1e0b00ea410bd57ddb27f9d93b"} Sep 29 11:19:55 crc kubenswrapper[4752]: I0929 11:19:55.611508 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7l2nn" event={"ID":"75317a48-de24-4698-ae2b-5b7a34cacd1c","Type":"ContainerDied","Data":"995ff86cd1b03da90352347fe93cb12a981033fd5d2832617638e9e76ed3f47f"} Sep 29 11:19:55 crc kubenswrapper[4752]: I0929 11:19:55.611524 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="995ff86cd1b03da90352347fe93cb12a981033fd5d2832617638e9e76ed3f47f" Sep 29 11:19:55 crc kubenswrapper[4752]: I0929 11:19:55.615742 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7l2nn" Sep 29 11:19:55 crc kubenswrapper[4752]: I0929 11:19:55.766173 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75317a48-de24-4698-ae2b-5b7a34cacd1c-utilities\") pod \"75317a48-de24-4698-ae2b-5b7a34cacd1c\" (UID: \"75317a48-de24-4698-ae2b-5b7a34cacd1c\") " Sep 29 11:19:55 crc kubenswrapper[4752]: I0929 11:19:55.766245 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jzzr\" (UniqueName: \"kubernetes.io/projected/75317a48-de24-4698-ae2b-5b7a34cacd1c-kube-api-access-9jzzr\") pod \"75317a48-de24-4698-ae2b-5b7a34cacd1c\" (UID: \"75317a48-de24-4698-ae2b-5b7a34cacd1c\") " Sep 29 11:19:55 crc kubenswrapper[4752]: I0929 11:19:55.766265 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75317a48-de24-4698-ae2b-5b7a34cacd1c-catalog-content\") pod \"75317a48-de24-4698-ae2b-5b7a34cacd1c\" (UID: \"75317a48-de24-4698-ae2b-5b7a34cacd1c\") " Sep 29 11:19:55 crc kubenswrapper[4752]: I0929 11:19:55.948573 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75317a48-de24-4698-ae2b-5b7a34cacd1c-utilities" (OuterVolumeSpecName: "utilities") pod "75317a48-de24-4698-ae2b-5b7a34cacd1c" (UID: "75317a48-de24-4698-ae2b-5b7a34cacd1c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:19:55 crc kubenswrapper[4752]: I0929 11:19:55.960558 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75317a48-de24-4698-ae2b-5b7a34cacd1c-kube-api-access-9jzzr" (OuterVolumeSpecName: "kube-api-access-9jzzr") pod "75317a48-de24-4698-ae2b-5b7a34cacd1c" (UID: "75317a48-de24-4698-ae2b-5b7a34cacd1c"). InnerVolumeSpecName "kube-api-access-9jzzr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:19:55 crc kubenswrapper[4752]: I0929 11:19:55.970405 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75317a48-de24-4698-ae2b-5b7a34cacd1c-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 11:19:55 crc kubenswrapper[4752]: I0929 11:19:55.970450 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jzzr\" (UniqueName: \"kubernetes.io/projected/75317a48-de24-4698-ae2b-5b7a34cacd1c-kube-api-access-9jzzr\") on node \"crc\" DevicePath \"\"" Sep 29 11:19:56 crc kubenswrapper[4752]: I0929 11:19:56.086593 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75317a48-de24-4698-ae2b-5b7a34cacd1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75317a48-de24-4698-ae2b-5b7a34cacd1c" (UID: "75317a48-de24-4698-ae2b-5b7a34cacd1c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:19:56 crc kubenswrapper[4752]: I0929 11:19:56.174652 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75317a48-de24-4698-ae2b-5b7a34cacd1c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 11:19:56 crc kubenswrapper[4752]: I0929 11:19:56.175789 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 11:19:56 crc kubenswrapper[4752]: I0929 11:19:56.175867 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:19:56 crc kubenswrapper[4752]: I0929 11:19:56.620048 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7l2nn" Sep 29 11:19:56 crc kubenswrapper[4752]: I0929 11:19:56.663590 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7l2nn"] Sep 29 11:19:56 crc kubenswrapper[4752]: I0929 11:19:56.669168 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7l2nn"] Sep 29 11:19:58 crc kubenswrapper[4752]: I0929 11:19:58.042392 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75317a48-de24-4698-ae2b-5b7a34cacd1c" path="/var/lib/kubelet/pods/75317a48-de24-4698-ae2b-5b7a34cacd1c/volumes" Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.177947 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-r67pr"] Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.183373 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-r67pr"] Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.235859 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcherd1a5-account-delete-4vszv"] Sep 29 11:20:11 crc kubenswrapper[4752]: E0929 11:20:11.236299 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75317a48-de24-4698-ae2b-5b7a34cacd1c" containerName="extract-content" Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.236322 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="75317a48-de24-4698-ae2b-5b7a34cacd1c" containerName="extract-content" Sep 29 11:20:11 crc kubenswrapper[4752]: E0929 11:20:11.236357 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75317a48-de24-4698-ae2b-5b7a34cacd1c" 
containerName="extract-utilities" Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.236366 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="75317a48-de24-4698-ae2b-5b7a34cacd1c" containerName="extract-utilities" Sep 29 11:20:11 crc kubenswrapper[4752]: E0929 11:20:11.236379 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75317a48-de24-4698-ae2b-5b7a34cacd1c" containerName="registry-server" Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.236387 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="75317a48-de24-4698-ae2b-5b7a34cacd1c" containerName="registry-server" Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.236582 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="75317a48-de24-4698-ae2b-5b7a34cacd1c" containerName="registry-server" Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.237318 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcherd1a5-account-delete-4vszv" Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.262438 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcherd1a5-account-delete-4vszv"] Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.319110 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.319671 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="1b4e8f55-830d-4a8a-928b-869e27fdc3ea" containerName="watcher-kuttl-api-log" containerID="cri-o://fcea50b93181f051cdbb5b41c19c2e316e880c6ee9f4e5e21e7567b886c3169c" gracePeriod=30 Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.319800 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="1b4e8f55-830d-4a8a-928b-869e27fdc3ea" 
containerName="watcher-api" containerID="cri-o://253b5aedaef3072a17469aceb0d35b41e5d9a7d91657f3ad991b4b180da75a83" gracePeriod=30 Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.338748 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4gbn\" (UniqueName: \"kubernetes.io/projected/db605475-0036-43df-8035-ce1d158382f6-kube-api-access-m4gbn\") pod \"watcherd1a5-account-delete-4vszv\" (UID: \"db605475-0036-43df-8035-ce1d158382f6\") " pod="watcher-kuttl-default/watcherd1a5-account-delete-4vszv" Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.347436 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.348710 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="e4632531-2c83-465d-9906-aa26083e17b4" containerName="watcher-decision-engine" containerID="cri-o://d6ec7c838c546cf5bc7200af013a511751d77a94449b3f368637d3a980d8d99a" gracePeriod=30 Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.374818 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.375011 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="1b36513c-f87c-4873-9ab6-629ccbb9c58e" containerName="watcher-applier" containerID="cri-o://93e5dc80b66d53ac1cbf12e5f78125492d39dc1295524630be0738540264aca1" gracePeriod=30 Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.441941 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4gbn\" (UniqueName: \"kubernetes.io/projected/db605475-0036-43df-8035-ce1d158382f6-kube-api-access-m4gbn\") pod \"watcherd1a5-account-delete-4vszv\" (UID: 
\"db605475-0036-43df-8035-ce1d158382f6\") " pod="watcher-kuttl-default/watcherd1a5-account-delete-4vszv" Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.460973 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4gbn\" (UniqueName: \"kubernetes.io/projected/db605475-0036-43df-8035-ce1d158382f6-kube-api-access-m4gbn\") pod \"watcherd1a5-account-delete-4vszv\" (UID: \"db605475-0036-43df-8035-ce1d158382f6\") " pod="watcher-kuttl-default/watcherd1a5-account-delete-4vszv" Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.560401 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcherd1a5-account-delete-4vszv" Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.743507 4752 generic.go:334] "Generic (PLEG): container finished" podID="1b4e8f55-830d-4a8a-928b-869e27fdc3ea" containerID="fcea50b93181f051cdbb5b41c19c2e316e880c6ee9f4e5e21e7567b886c3169c" exitCode=143 Sep 29 11:20:11 crc kubenswrapper[4752]: I0929 11:20:11.743712 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1b4e8f55-830d-4a8a-928b-869e27fdc3ea","Type":"ContainerDied","Data":"fcea50b93181f051cdbb5b41c19c2e316e880c6ee9f4e5e21e7567b886c3169c"} Sep 29 11:20:12 crc kubenswrapper[4752]: I0929 11:20:12.024312 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcherd1a5-account-delete-4vszv"] Sep 29 11:20:12 crc kubenswrapper[4752]: I0929 11:20:12.053588 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30380408-18cc-4feb-b122-aa9ae9047279" path="/var/lib/kubelet/pods/30380408-18cc-4feb-b122-aa9ae9047279/volumes" Sep 29 11:20:12 crc kubenswrapper[4752]: I0929 11:20:12.306415 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="1b4e8f55-830d-4a8a-928b-869e27fdc3ea" containerName="watcher-kuttl-api-log" probeResult="failure" 
output="Get \"https://10.217.0.158:9322/\": dial tcp 10.217.0.158:9322: connect: connection refused" Sep 29 11:20:12 crc kubenswrapper[4752]: I0929 11:20:12.306500 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="1b4e8f55-830d-4a8a-928b-869e27fdc3ea" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.158:9322/\": dial tcp 10.217.0.158:9322: connect: connection refused" Sep 29 11:20:12 crc kubenswrapper[4752]: E0929 11:20:12.438283 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93e5dc80b66d53ac1cbf12e5f78125492d39dc1295524630be0738540264aca1" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:20:12 crc kubenswrapper[4752]: E0929 11:20:12.439526 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93e5dc80b66d53ac1cbf12e5f78125492d39dc1295524630be0738540264aca1" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:20:12 crc kubenswrapper[4752]: E0929 11:20:12.440725 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93e5dc80b66d53ac1cbf12e5f78125492d39dc1295524630be0738540264aca1" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:20:12 crc kubenswrapper[4752]: E0929 11:20:12.440772 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" 
podUID="1b36513c-f87c-4873-9ab6-629ccbb9c58e" containerName="watcher-applier" Sep 29 11:20:12 crc kubenswrapper[4752]: I0929 11:20:12.754062 4752 generic.go:334] "Generic (PLEG): container finished" podID="db605475-0036-43df-8035-ce1d158382f6" containerID="5f30c3a09e2a5ce1467c592640ce15f1520dbbe9b97508532e8211b2514f57aa" exitCode=0 Sep 29 11:20:12 crc kubenswrapper[4752]: I0929 11:20:12.754129 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcherd1a5-account-delete-4vszv" event={"ID":"db605475-0036-43df-8035-ce1d158382f6","Type":"ContainerDied","Data":"5f30c3a09e2a5ce1467c592640ce15f1520dbbe9b97508532e8211b2514f57aa"} Sep 29 11:20:12 crc kubenswrapper[4752]: I0929 11:20:12.754156 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcherd1a5-account-delete-4vszv" event={"ID":"db605475-0036-43df-8035-ce1d158382f6","Type":"ContainerStarted","Data":"7fc765d54979928dc0881b14417e627d264ffecf6e1b03ea32513d4c6f65560b"} Sep 29 11:20:12 crc kubenswrapper[4752]: I0929 11:20:12.756536 4752 generic.go:334] "Generic (PLEG): container finished" podID="1b4e8f55-830d-4a8a-928b-869e27fdc3ea" containerID="253b5aedaef3072a17469aceb0d35b41e5d9a7d91657f3ad991b4b180da75a83" exitCode=0 Sep 29 11:20:12 crc kubenswrapper[4752]: I0929 11:20:12.756562 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1b4e8f55-830d-4a8a-928b-869e27fdc3ea","Type":"ContainerDied","Data":"253b5aedaef3072a17469aceb0d35b41e5d9a7d91657f3ad991b4b180da75a83"} Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.217731 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.275837 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-config-data\") pod \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.275899 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-logs\") pod \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.275936 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-internal-tls-certs\") pod \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.276050 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-public-tls-certs\") pod \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.276078 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-custom-prometheus-ca\") pod \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.276142 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-446sm\" (UniqueName: \"kubernetes.io/projected/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-kube-api-access-446sm\") pod \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.276162 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-combined-ca-bundle\") pod \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\" (UID: \"1b4e8f55-830d-4a8a-928b-869e27fdc3ea\") " Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.276431 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-logs" (OuterVolumeSpecName: "logs") pod "1b4e8f55-830d-4a8a-928b-869e27fdc3ea" (UID: "1b4e8f55-830d-4a8a-928b-869e27fdc3ea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.276616 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-logs\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.281615 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-kube-api-access-446sm" (OuterVolumeSpecName: "kube-api-access-446sm") pod "1b4e8f55-830d-4a8a-928b-869e27fdc3ea" (UID: "1b4e8f55-830d-4a8a-928b-869e27fdc3ea"). InnerVolumeSpecName "kube-api-access-446sm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.299797 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b4e8f55-830d-4a8a-928b-869e27fdc3ea" (UID: "1b4e8f55-830d-4a8a-928b-869e27fdc3ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.311471 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "1b4e8f55-830d-4a8a-928b-869e27fdc3ea" (UID: "1b4e8f55-830d-4a8a-928b-869e27fdc3ea"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.328873 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1b4e8f55-830d-4a8a-928b-869e27fdc3ea" (UID: "1b4e8f55-830d-4a8a-928b-869e27fdc3ea"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.330103 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-config-data" (OuterVolumeSpecName: "config-data") pod "1b4e8f55-830d-4a8a-928b-869e27fdc3ea" (UID: "1b4e8f55-830d-4a8a-928b-869e27fdc3ea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.334974 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1b4e8f55-830d-4a8a-928b-869e27fdc3ea" (UID: "1b4e8f55-830d-4a8a-928b-869e27fdc3ea"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.378235 4752 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.378270 4752 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.378280 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-446sm\" (UniqueName: \"kubernetes.io/projected/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-kube-api-access-446sm\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.378289 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.378299 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.378306 4752 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/1b4e8f55-830d-4a8a-928b-869e27fdc3ea-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.624169 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.624421 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="ec5e871e-bdda-4e23-8272-885dc11508b0" containerName="ceilometer-central-agent" containerID="cri-o://91374e3675552cbcbff4c57f4c196bfb1d123bcd3e926b8c6567f208c5f12a0a" gracePeriod=30 Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.624530 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="ec5e871e-bdda-4e23-8272-885dc11508b0" containerName="ceilometer-notification-agent" containerID="cri-o://716804ba5a6f0b8313c456060a661291191ca055cbbb93e747fd1452e4a593c0" gracePeriod=30 Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.624587 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="ec5e871e-bdda-4e23-8272-885dc11508b0" containerName="sg-core" containerID="cri-o://9901c1bf9f831b52d6b1cacae0e986c4ac21d9ef6a50d6356aecc6ec7e3033e7" gracePeriod=30 Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.624705 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="ec5e871e-bdda-4e23-8272-885dc11508b0" containerName="proxy-httpd" containerID="cri-o://417a7ce2376ea65c4d417e8c1805bd4c0a2d82be06b56a5cff0593eb996d01be" gracePeriod=30 Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.771711 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
event={"ID":"1b4e8f55-830d-4a8a-928b-869e27fdc3ea","Type":"ContainerDied","Data":"75a796114782c0013e8f87b5c614a80211226f26f97df7d06848656b325b0b03"} Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.771772 4752 scope.go:117] "RemoveContainer" containerID="253b5aedaef3072a17469aceb0d35b41e5d9a7d91657f3ad991b4b180da75a83" Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.771980 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.791653 4752 generic.go:334] "Generic (PLEG): container finished" podID="ec5e871e-bdda-4e23-8272-885dc11508b0" containerID="417a7ce2376ea65c4d417e8c1805bd4c0a2d82be06b56a5cff0593eb996d01be" exitCode=0 Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.791697 4752 generic.go:334] "Generic (PLEG): container finished" podID="ec5e871e-bdda-4e23-8272-885dc11508b0" containerID="9901c1bf9f831b52d6b1cacae0e986c4ac21d9ef6a50d6356aecc6ec7e3033e7" exitCode=2 Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.791911 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ec5e871e-bdda-4e23-8272-885dc11508b0","Type":"ContainerDied","Data":"417a7ce2376ea65c4d417e8c1805bd4c0a2d82be06b56a5cff0593eb996d01be"} Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.791942 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ec5e871e-bdda-4e23-8272-885dc11508b0","Type":"ContainerDied","Data":"9901c1bf9f831b52d6b1cacae0e986c4ac21d9ef6a50d6356aecc6ec7e3033e7"} Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.804411 4752 scope.go:117] "RemoveContainer" containerID="fcea50b93181f051cdbb5b41c19c2e316e880c6ee9f4e5e21e7567b886c3169c" Sep 29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.818162 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 
29 11:20:13 crc kubenswrapper[4752]: I0929 11:20:13.823688 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:20:14 crc kubenswrapper[4752]: I0929 11:20:14.041766 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b4e8f55-830d-4a8a-928b-869e27fdc3ea" path="/var/lib/kubelet/pods/1b4e8f55-830d-4a8a-928b-869e27fdc3ea/volumes" Sep 29 11:20:14 crc kubenswrapper[4752]: I0929 11:20:14.101501 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcherd1a5-account-delete-4vszv" Sep 29 11:20:14 crc kubenswrapper[4752]: I0929 11:20:14.192188 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4gbn\" (UniqueName: \"kubernetes.io/projected/db605475-0036-43df-8035-ce1d158382f6-kube-api-access-m4gbn\") pod \"db605475-0036-43df-8035-ce1d158382f6\" (UID: \"db605475-0036-43df-8035-ce1d158382f6\") " Sep 29 11:20:14 crc kubenswrapper[4752]: I0929 11:20:14.196578 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db605475-0036-43df-8035-ce1d158382f6-kube-api-access-m4gbn" (OuterVolumeSpecName: "kube-api-access-m4gbn") pod "db605475-0036-43df-8035-ce1d158382f6" (UID: "db605475-0036-43df-8035-ce1d158382f6"). InnerVolumeSpecName "kube-api-access-m4gbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:20:14 crc kubenswrapper[4752]: I0929 11:20:14.294229 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4gbn\" (UniqueName: \"kubernetes.io/projected/db605475-0036-43df-8035-ce1d158382f6-kube-api-access-m4gbn\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:14 crc kubenswrapper[4752]: I0929 11:20:14.804359 4752 generic.go:334] "Generic (PLEG): container finished" podID="ec5e871e-bdda-4e23-8272-885dc11508b0" containerID="91374e3675552cbcbff4c57f4c196bfb1d123bcd3e926b8c6567f208c5f12a0a" exitCode=0 Sep 29 11:20:14 crc kubenswrapper[4752]: I0929 11:20:14.804426 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ec5e871e-bdda-4e23-8272-885dc11508b0","Type":"ContainerDied","Data":"91374e3675552cbcbff4c57f4c196bfb1d123bcd3e926b8c6567f208c5f12a0a"} Sep 29 11:20:14 crc kubenswrapper[4752]: I0929 11:20:14.807256 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcherd1a5-account-delete-4vszv" Sep 29 11:20:14 crc kubenswrapper[4752]: I0929 11:20:14.807247 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcherd1a5-account-delete-4vszv" event={"ID":"db605475-0036-43df-8035-ce1d158382f6","Type":"ContainerDied","Data":"7fc765d54979928dc0881b14417e627d264ffecf6e1b03ea32513d4c6f65560b"} Sep 29 11:20:14 crc kubenswrapper[4752]: I0929 11:20:14.807358 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fc765d54979928dc0881b14417e627d264ffecf6e1b03ea32513d4c6f65560b" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.259855 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-rm9qv"] Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.268130 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-rm9qv"] Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.275943 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcherd1a5-account-delete-4vszv"] Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.283730 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-d1a5-account-create-p8qf5"] Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.298747 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcherd1a5-account-delete-4vszv"] Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.308293 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-d1a5-account-create-p8qf5"] Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.412406 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-rphvs"] Sep 29 11:20:16 crc kubenswrapper[4752]: E0929 11:20:16.412789 4752 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1b4e8f55-830d-4a8a-928b-869e27fdc3ea" containerName="watcher-kuttl-api-log" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.412822 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b4e8f55-830d-4a8a-928b-869e27fdc3ea" containerName="watcher-kuttl-api-log" Sep 29 11:20:16 crc kubenswrapper[4752]: E0929 11:20:16.412840 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db605475-0036-43df-8035-ce1d158382f6" containerName="mariadb-account-delete" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.412848 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="db605475-0036-43df-8035-ce1d158382f6" containerName="mariadb-account-delete" Sep 29 11:20:16 crc kubenswrapper[4752]: E0929 11:20:16.412865 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b4e8f55-830d-4a8a-928b-869e27fdc3ea" containerName="watcher-api" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.412872 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b4e8f55-830d-4a8a-928b-869e27fdc3ea" containerName="watcher-api" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.413050 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b4e8f55-830d-4a8a-928b-869e27fdc3ea" containerName="watcher-api" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.413067 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="db605475-0036-43df-8035-ce1d158382f6" containerName="mariadb-account-delete" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.413085 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b4e8f55-830d-4a8a-928b-869e27fdc3ea" containerName="watcher-kuttl-api-log" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.413722 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-rphvs" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.421867 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-rphvs"] Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.529402 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kxml\" (UniqueName: \"kubernetes.io/projected/879c3c04-5844-497c-81ae-c5218cdcd806-kube-api-access-9kxml\") pod \"watcher-db-create-rphvs\" (UID: \"879c3c04-5844-497c-81ae-c5218cdcd806\") " pod="watcher-kuttl-default/watcher-db-create-rphvs" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.630896 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kxml\" (UniqueName: \"kubernetes.io/projected/879c3c04-5844-497c-81ae-c5218cdcd806-kube-api-access-9kxml\") pod \"watcher-db-create-rphvs\" (UID: \"879c3c04-5844-497c-81ae-c5218cdcd806\") " pod="watcher-kuttl-default/watcher-db-create-rphvs" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.654674 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kxml\" (UniqueName: \"kubernetes.io/projected/879c3c04-5844-497c-81ae-c5218cdcd806-kube-api-access-9kxml\") pod \"watcher-db-create-rphvs\" (UID: \"879c3c04-5844-497c-81ae-c5218cdcd806\") " pod="watcher-kuttl-default/watcher-db-create-rphvs" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.752825 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.772570 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-rphvs" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.835404 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b36513c-f87c-4873-9ab6-629ccbb9c58e-logs\") pod \"1b36513c-f87c-4873-9ab6-629ccbb9c58e\" (UID: \"1b36513c-f87c-4873-9ab6-629ccbb9c58e\") " Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.835782 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b36513c-f87c-4873-9ab6-629ccbb9c58e-combined-ca-bundle\") pod \"1b36513c-f87c-4873-9ab6-629ccbb9c58e\" (UID: \"1b36513c-f87c-4873-9ab6-629ccbb9c58e\") " Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.835877 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b36513c-f87c-4873-9ab6-629ccbb9c58e-config-data\") pod \"1b36513c-f87c-4873-9ab6-629ccbb9c58e\" (UID: \"1b36513c-f87c-4873-9ab6-629ccbb9c58e\") " Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.835903 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvsgf\" (UniqueName: \"kubernetes.io/projected/1b36513c-f87c-4873-9ab6-629ccbb9c58e-kube-api-access-qvsgf\") pod \"1b36513c-f87c-4873-9ab6-629ccbb9c58e\" (UID: \"1b36513c-f87c-4873-9ab6-629ccbb9c58e\") " Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.835951 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b36513c-f87c-4873-9ab6-629ccbb9c58e-logs" (OuterVolumeSpecName: "logs") pod "1b36513c-f87c-4873-9ab6-629ccbb9c58e" (UID: "1b36513c-f87c-4873-9ab6-629ccbb9c58e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.838419 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b36513c-f87c-4873-9ab6-629ccbb9c58e-logs\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.840318 4752 generic.go:334] "Generic (PLEG): container finished" podID="1b36513c-f87c-4873-9ab6-629ccbb9c58e" containerID="93e5dc80b66d53ac1cbf12e5f78125492d39dc1295524630be0738540264aca1" exitCode=0 Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.840369 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.840392 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"1b36513c-f87c-4873-9ab6-629ccbb9c58e","Type":"ContainerDied","Data":"93e5dc80b66d53ac1cbf12e5f78125492d39dc1295524630be0738540264aca1"} Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.840419 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"1b36513c-f87c-4873-9ab6-629ccbb9c58e","Type":"ContainerDied","Data":"6e509b79b09fc3075b491601313f1f6876d9596deda0cf560880649df13eedfc"} Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.840436 4752 scope.go:117] "RemoveContainer" containerID="93e5dc80b66d53ac1cbf12e5f78125492d39dc1295524630be0738540264aca1" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.840952 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b36513c-f87c-4873-9ab6-629ccbb9c58e-kube-api-access-qvsgf" (OuterVolumeSpecName: "kube-api-access-qvsgf") pod "1b36513c-f87c-4873-9ab6-629ccbb9c58e" (UID: "1b36513c-f87c-4873-9ab6-629ccbb9c58e"). InnerVolumeSpecName "kube-api-access-qvsgf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.847714 4752 generic.go:334] "Generic (PLEG): container finished" podID="e4632531-2c83-465d-9906-aa26083e17b4" containerID="d6ec7c838c546cf5bc7200af013a511751d77a94449b3f368637d3a980d8d99a" exitCode=0 Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.847751 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"e4632531-2c83-465d-9906-aa26083e17b4","Type":"ContainerDied","Data":"d6ec7c838c546cf5bc7200af013a511751d77a94449b3f368637d3a980d8d99a"} Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.857233 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b36513c-f87c-4873-9ab6-629ccbb9c58e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b36513c-f87c-4873-9ab6-629ccbb9c58e" (UID: "1b36513c-f87c-4873-9ab6-629ccbb9c58e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.865886 4752 scope.go:117] "RemoveContainer" containerID="93e5dc80b66d53ac1cbf12e5f78125492d39dc1295524630be0738540264aca1" Sep 29 11:20:16 crc kubenswrapper[4752]: E0929 11:20:16.866445 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93e5dc80b66d53ac1cbf12e5f78125492d39dc1295524630be0738540264aca1\": container with ID starting with 93e5dc80b66d53ac1cbf12e5f78125492d39dc1295524630be0738540264aca1 not found: ID does not exist" containerID="93e5dc80b66d53ac1cbf12e5f78125492d39dc1295524630be0738540264aca1" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.866497 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93e5dc80b66d53ac1cbf12e5f78125492d39dc1295524630be0738540264aca1"} err="failed to get container status \"93e5dc80b66d53ac1cbf12e5f78125492d39dc1295524630be0738540264aca1\": rpc error: code = NotFound desc = could not find container \"93e5dc80b66d53ac1cbf12e5f78125492d39dc1295524630be0738540264aca1\": container with ID starting with 93e5dc80b66d53ac1cbf12e5f78125492d39dc1295524630be0738540264aca1 not found: ID does not exist" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.876564 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b36513c-f87c-4873-9ab6-629ccbb9c58e-config-data" (OuterVolumeSpecName: "config-data") pod "1b36513c-f87c-4873-9ab6-629ccbb9c58e" (UID: "1b36513c-f87c-4873-9ab6-629ccbb9c58e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.914838 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.943737 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b36513c-f87c-4873-9ab6-629ccbb9c58e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.944452 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b36513c-f87c-4873-9ab6-629ccbb9c58e-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:16 crc kubenswrapper[4752]: I0929 11:20:16.944477 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvsgf\" (UniqueName: \"kubernetes.io/projected/1b36513c-f87c-4873-9ab6-629ccbb9c58e-kube-api-access-qvsgf\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.045416 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnx6k\" (UniqueName: \"kubernetes.io/projected/e4632531-2c83-465d-9906-aa26083e17b4-kube-api-access-tnx6k\") pod \"e4632531-2c83-465d-9906-aa26083e17b4\" (UID: \"e4632531-2c83-465d-9906-aa26083e17b4\") " Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.045478 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e4632531-2c83-465d-9906-aa26083e17b4-custom-prometheus-ca\") pod \"e4632531-2c83-465d-9906-aa26083e17b4\" (UID: \"e4632531-2c83-465d-9906-aa26083e17b4\") " Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.045508 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4632531-2c83-465d-9906-aa26083e17b4-config-data\") pod \"e4632531-2c83-465d-9906-aa26083e17b4\" (UID: \"e4632531-2c83-465d-9906-aa26083e17b4\") " Sep 29 11:20:17 crc 
kubenswrapper[4752]: I0929 11:20:17.045567 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4632531-2c83-465d-9906-aa26083e17b4-logs\") pod \"e4632531-2c83-465d-9906-aa26083e17b4\" (UID: \"e4632531-2c83-465d-9906-aa26083e17b4\") " Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.045615 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4632531-2c83-465d-9906-aa26083e17b4-combined-ca-bundle\") pod \"e4632531-2c83-465d-9906-aa26083e17b4\" (UID: \"e4632531-2c83-465d-9906-aa26083e17b4\") " Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.046483 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4632531-2c83-465d-9906-aa26083e17b4-logs" (OuterVolumeSpecName: "logs") pod "e4632531-2c83-465d-9906-aa26083e17b4" (UID: "e4632531-2c83-465d-9906-aa26083e17b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.050096 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4632531-2c83-465d-9906-aa26083e17b4-kube-api-access-tnx6k" (OuterVolumeSpecName: "kube-api-access-tnx6k") pod "e4632531-2c83-465d-9906-aa26083e17b4" (UID: "e4632531-2c83-465d-9906-aa26083e17b4"). InnerVolumeSpecName "kube-api-access-tnx6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.067309 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4632531-2c83-465d-9906-aa26083e17b4-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "e4632531-2c83-465d-9906-aa26083e17b4" (UID: "e4632531-2c83-465d-9906-aa26083e17b4"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.072264 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4632531-2c83-465d-9906-aa26083e17b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4632531-2c83-465d-9906-aa26083e17b4" (UID: "e4632531-2c83-465d-9906-aa26083e17b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.086975 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4632531-2c83-465d-9906-aa26083e17b4-config-data" (OuterVolumeSpecName: "config-data") pod "e4632531-2c83-465d-9906-aa26083e17b4" (UID: "e4632531-2c83-465d-9906-aa26083e17b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.147050 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4632531-2c83-465d-9906-aa26083e17b4-logs\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.147097 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4632531-2c83-465d-9906-aa26083e17b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.147107 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnx6k\" (UniqueName: \"kubernetes.io/projected/e4632531-2c83-465d-9906-aa26083e17b4-kube-api-access-tnx6k\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.147116 4752 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e4632531-2c83-465d-9906-aa26083e17b4-custom-prometheus-ca\") on node \"crc\" 
DevicePath \"\"" Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.147124 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4632531-2c83-465d-9906-aa26083e17b4-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.173494 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.179111 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.229049 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-rphvs"] Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.857131 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"e4632531-2c83-465d-9906-aa26083e17b4","Type":"ContainerDied","Data":"b017fd771e4601da235b357aae98aee69a55aa93c450e2b2082996bd07c6293c"} Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.857181 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.857472 4752 scope.go:117] "RemoveContainer" containerID="d6ec7c838c546cf5bc7200af013a511751d77a94449b3f368637d3a980d8d99a" Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.861776 4752 generic.go:334] "Generic (PLEG): container finished" podID="ec5e871e-bdda-4e23-8272-885dc11508b0" containerID="716804ba5a6f0b8313c456060a661291191ca055cbbb93e747fd1452e4a593c0" exitCode=0 Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.861842 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ec5e871e-bdda-4e23-8272-885dc11508b0","Type":"ContainerDied","Data":"716804ba5a6f0b8313c456060a661291191ca055cbbb93e747fd1452e4a593c0"} Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.863182 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-rphvs" event={"ID":"879c3c04-5844-497c-81ae-c5218cdcd806","Type":"ContainerStarted","Data":"0e4f1e827e2e4ebc0ea1d82fac6dbc8c45acccf2dca2fa2aecbdde2864203599"} Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.863206 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-rphvs" event={"ID":"879c3c04-5844-497c-81ae-c5218cdcd806","Type":"ContainerStarted","Data":"bccb1616a5d572c1fcbe5614c5470344a3448b43da7692ecbfd11d0cabb70e94"} Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.884102 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-db-create-rphvs" podStartSLOduration=1.8840799339999998 podStartE2EDuration="1.884079934s" podCreationTimestamp="2025-09-29 11:20:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:20:17.877128773 +0000 UTC m=+2158.666270440" 
watchObservedRunningTime="2025-09-29 11:20:17.884079934 +0000 UTC m=+2158.673221601" Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.896585 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:20:17 crc kubenswrapper[4752]: I0929 11:20:17.901402 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.047378 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b36513c-f87c-4873-9ab6-629ccbb9c58e" path="/var/lib/kubelet/pods/1b36513c-f87c-4873-9ab6-629ccbb9c58e/volumes" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.048171 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d9cb262-2d02-4933-a57b-05a013784977" path="/var/lib/kubelet/pods/8d9cb262-2d02-4933-a57b-05a013784977/volumes" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.048930 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d41c2494-66d2-4067-9019-409ba7cdab85" path="/var/lib/kubelet/pods/d41c2494-66d2-4067-9019-409ba7cdab85/volumes" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.050077 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db605475-0036-43df-8035-ce1d158382f6" path="/var/lib/kubelet/pods/db605475-0036-43df-8035-ce1d158382f6/volumes" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.051996 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4632531-2c83-465d-9906-aa26083e17b4" path="/var/lib/kubelet/pods/e4632531-2c83-465d-9906-aa26083e17b4/volumes" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.190752 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.277407 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l955s\" (UniqueName: \"kubernetes.io/projected/ec5e871e-bdda-4e23-8272-885dc11508b0-kube-api-access-l955s\") pod \"ec5e871e-bdda-4e23-8272-885dc11508b0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.277463 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-config-data\") pod \"ec5e871e-bdda-4e23-8272-885dc11508b0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.277498 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec5e871e-bdda-4e23-8272-885dc11508b0-log-httpd\") pod \"ec5e871e-bdda-4e23-8272-885dc11508b0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.277572 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-sg-core-conf-yaml\") pod \"ec5e871e-bdda-4e23-8272-885dc11508b0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.277621 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-ceilometer-tls-certs\") pod \"ec5e871e-bdda-4e23-8272-885dc11508b0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.277694 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec5e871e-bdda-4e23-8272-885dc11508b0-run-httpd\") pod \"ec5e871e-bdda-4e23-8272-885dc11508b0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.278034 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec5e871e-bdda-4e23-8272-885dc11508b0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ec5e871e-bdda-4e23-8272-885dc11508b0" (UID: "ec5e871e-bdda-4e23-8272-885dc11508b0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.278043 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec5e871e-bdda-4e23-8272-885dc11508b0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ec5e871e-bdda-4e23-8272-885dc11508b0" (UID: "ec5e871e-bdda-4e23-8272-885dc11508b0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.278122 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-scripts\") pod \"ec5e871e-bdda-4e23-8272-885dc11508b0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.278176 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-combined-ca-bundle\") pod \"ec5e871e-bdda-4e23-8272-885dc11508b0\" (UID: \"ec5e871e-bdda-4e23-8272-885dc11508b0\") " Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.278631 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec5e871e-bdda-4e23-8272-885dc11508b0-run-httpd\") on node \"crc\" DevicePath 
\"\"" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.278653 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec5e871e-bdda-4e23-8272-885dc11508b0-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.282762 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-scripts" (OuterVolumeSpecName: "scripts") pod "ec5e871e-bdda-4e23-8272-885dc11508b0" (UID: "ec5e871e-bdda-4e23-8272-885dc11508b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.299129 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec5e871e-bdda-4e23-8272-885dc11508b0-kube-api-access-l955s" (OuterVolumeSpecName: "kube-api-access-l955s") pod "ec5e871e-bdda-4e23-8272-885dc11508b0" (UID: "ec5e871e-bdda-4e23-8272-885dc11508b0"). InnerVolumeSpecName "kube-api-access-l955s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.311338 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ec5e871e-bdda-4e23-8272-885dc11508b0" (UID: "ec5e871e-bdda-4e23-8272-885dc11508b0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.337434 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ec5e871e-bdda-4e23-8272-885dc11508b0" (UID: "ec5e871e-bdda-4e23-8272-885dc11508b0"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.339263 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec5e871e-bdda-4e23-8272-885dc11508b0" (UID: "ec5e871e-bdda-4e23-8272-885dc11508b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.364771 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-config-data" (OuterVolumeSpecName: "config-data") pod "ec5e871e-bdda-4e23-8272-885dc11508b0" (UID: "ec5e871e-bdda-4e23-8272-885dc11508b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.379831 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l955s\" (UniqueName: \"kubernetes.io/projected/ec5e871e-bdda-4e23-8272-885dc11508b0-kube-api-access-l955s\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.379864 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.379875 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.379886 4752 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-ceilometer-tls-certs\") on node \"crc\" 
DevicePath \"\"" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.379897 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.379907 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec5e871e-bdda-4e23-8272-885dc11508b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.878573 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ec5e871e-bdda-4e23-8272-885dc11508b0","Type":"ContainerDied","Data":"f85e735580def5f136789a0da867928ae873f88596556498f8f397aeb08c6f13"} Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.878609 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.878643 4752 scope.go:117] "RemoveContainer" containerID="417a7ce2376ea65c4d417e8c1805bd4c0a2d82be06b56a5cff0593eb996d01be" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.880073 4752 generic.go:334] "Generic (PLEG): container finished" podID="879c3c04-5844-497c-81ae-c5218cdcd806" containerID="0e4f1e827e2e4ebc0ea1d82fac6dbc8c45acccf2dca2fa2aecbdde2864203599" exitCode=0 Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.880121 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-rphvs" event={"ID":"879c3c04-5844-497c-81ae-c5218cdcd806","Type":"ContainerDied","Data":"0e4f1e827e2e4ebc0ea1d82fac6dbc8c45acccf2dca2fa2aecbdde2864203599"} Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.898703 4752 scope.go:117] "RemoveContainer" containerID="9901c1bf9f831b52d6b1cacae0e986c4ac21d9ef6a50d6356aecc6ec7e3033e7" Sep 29 11:20:18 crc 
kubenswrapper[4752]: I0929 11:20:18.919780 4752 scope.go:117] "RemoveContainer" containerID="716804ba5a6f0b8313c456060a661291191ca055cbbb93e747fd1452e4a593c0" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.922502 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.930369 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.939249 4752 scope.go:117] "RemoveContainer" containerID="91374e3675552cbcbff4c57f4c196bfb1d123bcd3e926b8c6567f208c5f12a0a" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.947895 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:20:18 crc kubenswrapper[4752]: E0929 11:20:18.948373 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4632531-2c83-465d-9906-aa26083e17b4" containerName="watcher-decision-engine" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.948466 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4632531-2c83-465d-9906-aa26083e17b4" containerName="watcher-decision-engine" Sep 29 11:20:18 crc kubenswrapper[4752]: E0929 11:20:18.948540 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec5e871e-bdda-4e23-8272-885dc11508b0" containerName="proxy-httpd" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.948612 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec5e871e-bdda-4e23-8272-885dc11508b0" containerName="proxy-httpd" Sep 29 11:20:18 crc kubenswrapper[4752]: E0929 11:20:18.948669 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec5e871e-bdda-4e23-8272-885dc11508b0" containerName="ceilometer-notification-agent" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.948721 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec5e871e-bdda-4e23-8272-885dc11508b0" 
containerName="ceilometer-notification-agent" Sep 29 11:20:18 crc kubenswrapper[4752]: E0929 11:20:18.948796 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec5e871e-bdda-4e23-8272-885dc11508b0" containerName="ceilometer-central-agent" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.948888 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec5e871e-bdda-4e23-8272-885dc11508b0" containerName="ceilometer-central-agent" Sep 29 11:20:18 crc kubenswrapper[4752]: E0929 11:20:18.948964 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec5e871e-bdda-4e23-8272-885dc11508b0" containerName="sg-core" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.949169 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec5e871e-bdda-4e23-8272-885dc11508b0" containerName="sg-core" Sep 29 11:20:18 crc kubenswrapper[4752]: E0929 11:20:18.949247 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b36513c-f87c-4873-9ab6-629ccbb9c58e" containerName="watcher-applier" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.949307 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b36513c-f87c-4873-9ab6-629ccbb9c58e" containerName="watcher-applier" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.949489 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b36513c-f87c-4873-9ab6-629ccbb9c58e" containerName="watcher-applier" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.949565 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec5e871e-bdda-4e23-8272-885dc11508b0" containerName="ceilometer-notification-agent" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.949644 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec5e871e-bdda-4e23-8272-885dc11508b0" containerName="proxy-httpd" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.949708 4752 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ec5e871e-bdda-4e23-8272-885dc11508b0" containerName="sg-core" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.949786 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec5e871e-bdda-4e23-8272-885dc11508b0" containerName="ceilometer-central-agent" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.949878 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4632531-2c83-465d-9906-aa26083e17b4" containerName="watcher-decision-engine" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.951550 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.953823 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.957361 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.957546 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Sep 29 11:20:18 crc kubenswrapper[4752]: I0929 11:20:18.964168 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.088639 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.088689 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-config-data\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.088842 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-scripts\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.088926 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.088978 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jnnz\" (UniqueName: \"kubernetes.io/projected/8a0c35a4-54fb-48cf-8ead-1dab82509893-kube-api-access-5jnnz\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.089177 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0c35a4-54fb-48cf-8ead-1dab82509893-run-httpd\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.089217 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0c35a4-54fb-48cf-8ead-1dab82509893-log-httpd\") 
pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.089351 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.191194 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.191256 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jnnz\" (UniqueName: \"kubernetes.io/projected/8a0c35a4-54fb-48cf-8ead-1dab82509893-kube-api-access-5jnnz\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.191307 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0c35a4-54fb-48cf-8ead-1dab82509893-run-httpd\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.191324 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0c35a4-54fb-48cf-8ead-1dab82509893-log-httpd\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 
crc kubenswrapper[4752]: I0929 11:20:19.191351 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.191410 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.191439 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-config-data\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.191477 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-scripts\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.191835 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0c35a4-54fb-48cf-8ead-1dab82509893-run-httpd\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.192518 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8a0c35a4-54fb-48cf-8ead-1dab82509893-log-httpd\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.196283 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.196387 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.196473 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-config-data\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.196893 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-scripts\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.197238 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 
crc kubenswrapper[4752]: I0929 11:20:19.210364 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jnnz\" (UniqueName: \"kubernetes.io/projected/8a0c35a4-54fb-48cf-8ead-1dab82509893-kube-api-access-5jnnz\") pod \"ceilometer-0\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.277087 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.815353 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:20:19 crc kubenswrapper[4752]: I0929 11:20:19.895686 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8a0c35a4-54fb-48cf-8ead-1dab82509893","Type":"ContainerStarted","Data":"82eed08bce346d24b2388e166bfc55e2ede547defd69fe6c92c1897ec925b989"} Sep 29 11:20:20 crc kubenswrapper[4752]: I0929 11:20:20.046058 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec5e871e-bdda-4e23-8272-885dc11508b0" path="/var/lib/kubelet/pods/ec5e871e-bdda-4e23-8272-885dc11508b0/volumes" Sep 29 11:20:20 crc kubenswrapper[4752]: I0929 11:20:20.204172 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-rphvs" Sep 29 11:20:20 crc kubenswrapper[4752]: I0929 11:20:20.308964 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kxml\" (UniqueName: \"kubernetes.io/projected/879c3c04-5844-497c-81ae-c5218cdcd806-kube-api-access-9kxml\") pod \"879c3c04-5844-497c-81ae-c5218cdcd806\" (UID: \"879c3c04-5844-497c-81ae-c5218cdcd806\") " Sep 29 11:20:20 crc kubenswrapper[4752]: I0929 11:20:20.314135 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/879c3c04-5844-497c-81ae-c5218cdcd806-kube-api-access-9kxml" (OuterVolumeSpecName: "kube-api-access-9kxml") pod "879c3c04-5844-497c-81ae-c5218cdcd806" (UID: "879c3c04-5844-497c-81ae-c5218cdcd806"). InnerVolumeSpecName "kube-api-access-9kxml". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:20:20 crc kubenswrapper[4752]: I0929 11:20:20.411314 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kxml\" (UniqueName: \"kubernetes.io/projected/879c3c04-5844-497c-81ae-c5218cdcd806-kube-api-access-9kxml\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:20 crc kubenswrapper[4752]: I0929 11:20:20.909438 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-rphvs" event={"ID":"879c3c04-5844-497c-81ae-c5218cdcd806","Type":"ContainerDied","Data":"bccb1616a5d572c1fcbe5614c5470344a3448b43da7692ecbfd11d0cabb70e94"} Sep 29 11:20:20 crc kubenswrapper[4752]: I0929 11:20:20.909698 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bccb1616a5d572c1fcbe5614c5470344a3448b43da7692ecbfd11d0cabb70e94" Sep 29 11:20:20 crc kubenswrapper[4752]: I0929 11:20:20.909706 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-rphvs" Sep 29 11:20:21 crc kubenswrapper[4752]: I0929 11:20:21.919362 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8a0c35a4-54fb-48cf-8ead-1dab82509893","Type":"ContainerStarted","Data":"6e4482e9ccb24286f1472daaae355e9e4140fcaeadc83ba16633469bb5b69c60"} Sep 29 11:20:22 crc kubenswrapper[4752]: I0929 11:20:22.929556 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8a0c35a4-54fb-48cf-8ead-1dab82509893","Type":"ContainerStarted","Data":"8ce9834b371b8dbf84399b7f52ffa05a8e3c250b1bfc3cf2182abb12a958f8fe"} Sep 29 11:20:22 crc kubenswrapper[4752]: I0929 11:20:22.929915 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8a0c35a4-54fb-48cf-8ead-1dab82509893","Type":"ContainerStarted","Data":"574ffa2e61d591a17326b83fa0cc3b75d916a4b261680dd3d439dca34c14f20c"} Sep 29 11:20:24 crc kubenswrapper[4752]: I0929 11:20:24.946906 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8a0c35a4-54fb-48cf-8ead-1dab82509893","Type":"ContainerStarted","Data":"b860ce00d8b55e48951cec932077c9833ddb0c119831b7af972e7a19e2003420"} Sep 29 11:20:24 crc kubenswrapper[4752]: I0929 11:20:24.947410 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:24 crc kubenswrapper[4752]: I0929 11:20:24.970963 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.491728674 podStartE2EDuration="6.970943582s" podCreationTimestamp="2025-09-29 11:20:18 +0000 UTC" firstStartedPulling="2025-09-29 11:20:19.803407595 +0000 UTC m=+2160.592549262" lastFinishedPulling="2025-09-29 11:20:24.282622503 +0000 UTC m=+2165.071764170" observedRunningTime="2025-09-29 
11:20:24.968410726 +0000 UTC m=+2165.757552403" watchObservedRunningTime="2025-09-29 11:20:24.970943582 +0000 UTC m=+2165.760085259" Sep 29 11:20:26 crc kubenswrapper[4752]: I0929 11:20:26.175492 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 11:20:26 crc kubenswrapper[4752]: I0929 11:20:26.175554 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:20:26 crc kubenswrapper[4752]: I0929 11:20:26.175599 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 11:20:26 crc kubenswrapper[4752]: I0929 11:20:26.176227 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37"} pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 11:20:26 crc kubenswrapper[4752]: I0929 11:20:26.176270 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" containerID="cri-o://93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" gracePeriod=600 Sep 29 11:20:26 crc kubenswrapper[4752]: E0929 11:20:26.308477 4752 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:20:26 crc kubenswrapper[4752]: I0929 11:20:26.423706 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-3ba4-account-create-gjw7n"] Sep 29 11:20:26 crc kubenswrapper[4752]: E0929 11:20:26.424159 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="879c3c04-5844-497c-81ae-c5218cdcd806" containerName="mariadb-database-create" Sep 29 11:20:26 crc kubenswrapper[4752]: I0929 11:20:26.424182 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="879c3c04-5844-497c-81ae-c5218cdcd806" containerName="mariadb-database-create" Sep 29 11:20:26 crc kubenswrapper[4752]: I0929 11:20:26.424417 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="879c3c04-5844-497c-81ae-c5218cdcd806" containerName="mariadb-database-create" Sep 29 11:20:26 crc kubenswrapper[4752]: I0929 11:20:26.425137 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-3ba4-account-create-gjw7n" Sep 29 11:20:26 crc kubenswrapper[4752]: I0929 11:20:26.430422 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Sep 29 11:20:26 crc kubenswrapper[4752]: I0929 11:20:26.438481 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-3ba4-account-create-gjw7n"] Sep 29 11:20:26 crc kubenswrapper[4752]: I0929 11:20:26.512408 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brfw8\" (UniqueName: \"kubernetes.io/projected/761747e9-4df9-49ce-acdc-dac1809f07b7-kube-api-access-brfw8\") pod \"watcher-3ba4-account-create-gjw7n\" (UID: \"761747e9-4df9-49ce-acdc-dac1809f07b7\") " pod="watcher-kuttl-default/watcher-3ba4-account-create-gjw7n" Sep 29 11:20:26 crc kubenswrapper[4752]: I0929 11:20:26.614439 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brfw8\" (UniqueName: \"kubernetes.io/projected/761747e9-4df9-49ce-acdc-dac1809f07b7-kube-api-access-brfw8\") pod \"watcher-3ba4-account-create-gjw7n\" (UID: \"761747e9-4df9-49ce-acdc-dac1809f07b7\") " pod="watcher-kuttl-default/watcher-3ba4-account-create-gjw7n" Sep 29 11:20:26 crc kubenswrapper[4752]: I0929 11:20:26.635927 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brfw8\" (UniqueName: \"kubernetes.io/projected/761747e9-4df9-49ce-acdc-dac1809f07b7-kube-api-access-brfw8\") pod \"watcher-3ba4-account-create-gjw7n\" (UID: \"761747e9-4df9-49ce-acdc-dac1809f07b7\") " pod="watcher-kuttl-default/watcher-3ba4-account-create-gjw7n" Sep 29 11:20:26 crc kubenswrapper[4752]: I0929 11:20:26.740515 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-3ba4-account-create-gjw7n" Sep 29 11:20:26 crc kubenswrapper[4752]: I0929 11:20:26.973921 4752 generic.go:334] "Generic (PLEG): container finished" podID="5863c243-797d-462a-b11f-71aaf005f8d1" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" exitCode=0 Sep 29 11:20:26 crc kubenswrapper[4752]: I0929 11:20:26.974230 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerDied","Data":"93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37"} Sep 29 11:20:26 crc kubenswrapper[4752]: I0929 11:20:26.974262 4752 scope.go:117] "RemoveContainer" containerID="910e9715ff191c5fe48e666106182c9ab8ee872d75d2fda3908f263b58dd32be" Sep 29 11:20:26 crc kubenswrapper[4752]: I0929 11:20:26.974773 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:20:26 crc kubenswrapper[4752]: E0929 11:20:26.975116 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:20:27 crc kubenswrapper[4752]: I0929 11:20:27.177025 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-3ba4-account-create-gjw7n"] Sep 29 11:20:27 crc kubenswrapper[4752]: W0929 11:20:27.181670 4752 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod761747e9_4df9_49ce_acdc_dac1809f07b7.slice/crio-91d1fea30a9fca7105c0086b8c37d18e4af5477e567050be22db5079490a83f1 WatchSource:0}: Error finding container 91d1fea30a9fca7105c0086b8c37d18e4af5477e567050be22db5079490a83f1: Status 404 returned error can't find the container with id 91d1fea30a9fca7105c0086b8c37d18e4af5477e567050be22db5079490a83f1 Sep 29 11:20:27 crc kubenswrapper[4752]: I0929 11:20:27.987549 4752 generic.go:334] "Generic (PLEG): container finished" podID="761747e9-4df9-49ce-acdc-dac1809f07b7" containerID="4fbf3f9f90bbde8caa7b4d7d58e2151c32ac096a9e406c00832a3a1df376cb28" exitCode=0 Sep 29 11:20:27 crc kubenswrapper[4752]: I0929 11:20:27.987774 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3ba4-account-create-gjw7n" event={"ID":"761747e9-4df9-49ce-acdc-dac1809f07b7","Type":"ContainerDied","Data":"4fbf3f9f90bbde8caa7b4d7d58e2151c32ac096a9e406c00832a3a1df376cb28"} Sep 29 11:20:27 crc kubenswrapper[4752]: I0929 11:20:27.988102 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3ba4-account-create-gjw7n" event={"ID":"761747e9-4df9-49ce-acdc-dac1809f07b7","Type":"ContainerStarted","Data":"91d1fea30a9fca7105c0086b8c37d18e4af5477e567050be22db5079490a83f1"} Sep 29 11:20:29 crc kubenswrapper[4752]: I0929 11:20:29.331675 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-3ba4-account-create-gjw7n" Sep 29 11:20:29 crc kubenswrapper[4752]: I0929 11:20:29.489771 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brfw8\" (UniqueName: \"kubernetes.io/projected/761747e9-4df9-49ce-acdc-dac1809f07b7-kube-api-access-brfw8\") pod \"761747e9-4df9-49ce-acdc-dac1809f07b7\" (UID: \"761747e9-4df9-49ce-acdc-dac1809f07b7\") " Sep 29 11:20:29 crc kubenswrapper[4752]: I0929 11:20:29.495992 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761747e9-4df9-49ce-acdc-dac1809f07b7-kube-api-access-brfw8" (OuterVolumeSpecName: "kube-api-access-brfw8") pod "761747e9-4df9-49ce-acdc-dac1809f07b7" (UID: "761747e9-4df9-49ce-acdc-dac1809f07b7"). InnerVolumeSpecName "kube-api-access-brfw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:20:29 crc kubenswrapper[4752]: I0929 11:20:29.591515 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brfw8\" (UniqueName: \"kubernetes.io/projected/761747e9-4df9-49ce-acdc-dac1809f07b7-kube-api-access-brfw8\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:30 crc kubenswrapper[4752]: I0929 11:20:30.015887 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3ba4-account-create-gjw7n" event={"ID":"761747e9-4df9-49ce-acdc-dac1809f07b7","Type":"ContainerDied","Data":"91d1fea30a9fca7105c0086b8c37d18e4af5477e567050be22db5079490a83f1"} Sep 29 11:20:30 crc kubenswrapper[4752]: I0929 11:20:30.015945 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91d1fea30a9fca7105c0086b8c37d18e4af5477e567050be22db5079490a83f1" Sep 29 11:20:30 crc kubenswrapper[4752]: I0929 11:20:30.016037 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-3ba4-account-create-gjw7n" Sep 29 11:20:31 crc kubenswrapper[4752]: I0929 11:20:31.858658 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d"] Sep 29 11:20:31 crc kubenswrapper[4752]: E0929 11:20:31.859262 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761747e9-4df9-49ce-acdc-dac1809f07b7" containerName="mariadb-account-create" Sep 29 11:20:31 crc kubenswrapper[4752]: I0929 11:20:31.859277 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="761747e9-4df9-49ce-acdc-dac1809f07b7" containerName="mariadb-account-create" Sep 29 11:20:31 crc kubenswrapper[4752]: I0929 11:20:31.859491 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="761747e9-4df9-49ce-acdc-dac1809f07b7" containerName="mariadb-account-create" Sep 29 11:20:31 crc kubenswrapper[4752]: I0929 11:20:31.860143 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" Sep 29 11:20:31 crc kubenswrapper[4752]: I0929 11:20:31.862257 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Sep 29 11:20:31 crc kubenswrapper[4752]: I0929 11:20:31.862605 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-g2slg" Sep 29 11:20:31 crc kubenswrapper[4752]: I0929 11:20:31.875945 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d"] Sep 29 11:20:32 crc kubenswrapper[4752]: I0929 11:20:32.029222 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4872b8c9-dd16-411f-a040-58971a7cf987-db-sync-config-data\") pod \"watcher-kuttl-db-sync-b6v7d\" (UID: \"4872b8c9-dd16-411f-a040-58971a7cf987\") " 
pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" Sep 29 11:20:32 crc kubenswrapper[4752]: I0929 11:20:32.029501 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4872b8c9-dd16-411f-a040-58971a7cf987-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-b6v7d\" (UID: \"4872b8c9-dd16-411f-a040-58971a7cf987\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" Sep 29 11:20:32 crc kubenswrapper[4752]: I0929 11:20:32.029675 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4872b8c9-dd16-411f-a040-58971a7cf987-config-data\") pod \"watcher-kuttl-db-sync-b6v7d\" (UID: \"4872b8c9-dd16-411f-a040-58971a7cf987\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" Sep 29 11:20:32 crc kubenswrapper[4752]: I0929 11:20:32.029889 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqcxb\" (UniqueName: \"kubernetes.io/projected/4872b8c9-dd16-411f-a040-58971a7cf987-kube-api-access-zqcxb\") pod \"watcher-kuttl-db-sync-b6v7d\" (UID: \"4872b8c9-dd16-411f-a040-58971a7cf987\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" Sep 29 11:20:32 crc kubenswrapper[4752]: I0929 11:20:32.133729 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqcxb\" (UniqueName: \"kubernetes.io/projected/4872b8c9-dd16-411f-a040-58971a7cf987-kube-api-access-zqcxb\") pod \"watcher-kuttl-db-sync-b6v7d\" (UID: \"4872b8c9-dd16-411f-a040-58971a7cf987\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" Sep 29 11:20:32 crc kubenswrapper[4752]: I0929 11:20:32.133891 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4872b8c9-dd16-411f-a040-58971a7cf987-db-sync-config-data\") 
pod \"watcher-kuttl-db-sync-b6v7d\" (UID: \"4872b8c9-dd16-411f-a040-58971a7cf987\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" Sep 29 11:20:32 crc kubenswrapper[4752]: I0929 11:20:32.133920 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4872b8c9-dd16-411f-a040-58971a7cf987-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-b6v7d\" (UID: \"4872b8c9-dd16-411f-a040-58971a7cf987\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" Sep 29 11:20:32 crc kubenswrapper[4752]: I0929 11:20:32.135755 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4872b8c9-dd16-411f-a040-58971a7cf987-config-data\") pod \"watcher-kuttl-db-sync-b6v7d\" (UID: \"4872b8c9-dd16-411f-a040-58971a7cf987\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" Sep 29 11:20:32 crc kubenswrapper[4752]: I0929 11:20:32.139406 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4872b8c9-dd16-411f-a040-58971a7cf987-db-sync-config-data\") pod \"watcher-kuttl-db-sync-b6v7d\" (UID: \"4872b8c9-dd16-411f-a040-58971a7cf987\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" Sep 29 11:20:32 crc kubenswrapper[4752]: I0929 11:20:32.140147 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4872b8c9-dd16-411f-a040-58971a7cf987-config-data\") pod \"watcher-kuttl-db-sync-b6v7d\" (UID: \"4872b8c9-dd16-411f-a040-58971a7cf987\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" Sep 29 11:20:32 crc kubenswrapper[4752]: I0929 11:20:32.140862 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4872b8c9-dd16-411f-a040-58971a7cf987-combined-ca-bundle\") pod 
\"watcher-kuttl-db-sync-b6v7d\" (UID: \"4872b8c9-dd16-411f-a040-58971a7cf987\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" Sep 29 11:20:32 crc kubenswrapper[4752]: I0929 11:20:32.153709 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqcxb\" (UniqueName: \"kubernetes.io/projected/4872b8c9-dd16-411f-a040-58971a7cf987-kube-api-access-zqcxb\") pod \"watcher-kuttl-db-sync-b6v7d\" (UID: \"4872b8c9-dd16-411f-a040-58971a7cf987\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" Sep 29 11:20:32 crc kubenswrapper[4752]: I0929 11:20:32.182183 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" Sep 29 11:20:32 crc kubenswrapper[4752]: I0929 11:20:32.618425 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d"] Sep 29 11:20:32 crc kubenswrapper[4752]: W0929 11:20:32.622070 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4872b8c9_dd16_411f_a040_58971a7cf987.slice/crio-b20665d3ae0eb40fffab0aa18f787d4d98da943de8156928c06b7556d26edcd6 WatchSource:0}: Error finding container b20665d3ae0eb40fffab0aa18f787d4d98da943de8156928c06b7556d26edcd6: Status 404 returned error can't find the container with id b20665d3ae0eb40fffab0aa18f787d4d98da943de8156928c06b7556d26edcd6 Sep 29 11:20:33 crc kubenswrapper[4752]: I0929 11:20:33.040734 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" event={"ID":"4872b8c9-dd16-411f-a040-58971a7cf987","Type":"ContainerStarted","Data":"da63c6596140475a28463c9fad39f6b1c0241c51dbb01c52f19a84c3a9a26c6e"} Sep 29 11:20:33 crc kubenswrapper[4752]: I0929 11:20:33.040827 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" 
event={"ID":"4872b8c9-dd16-411f-a040-58971a7cf987","Type":"ContainerStarted","Data":"b20665d3ae0eb40fffab0aa18f787d4d98da943de8156928c06b7556d26edcd6"} Sep 29 11:20:34 crc kubenswrapper[4752]: I0929 11:20:34.063233 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" podStartSLOduration=3.06321519 podStartE2EDuration="3.06321519s" podCreationTimestamp="2025-09-29 11:20:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:20:34.061390123 +0000 UTC m=+2174.850531800" watchObservedRunningTime="2025-09-29 11:20:34.06321519 +0000 UTC m=+2174.852356857" Sep 29 11:20:36 crc kubenswrapper[4752]: I0929 11:20:36.067206 4752 generic.go:334] "Generic (PLEG): container finished" podID="4872b8c9-dd16-411f-a040-58971a7cf987" containerID="da63c6596140475a28463c9fad39f6b1c0241c51dbb01c52f19a84c3a9a26c6e" exitCode=0 Sep 29 11:20:36 crc kubenswrapper[4752]: I0929 11:20:36.067303 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" event={"ID":"4872b8c9-dd16-411f-a040-58971a7cf987","Type":"ContainerDied","Data":"da63c6596140475a28463c9fad39f6b1c0241c51dbb01c52f19a84c3a9a26c6e"} Sep 29 11:20:37 crc kubenswrapper[4752]: I0929 11:20:37.408197 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" Sep 29 11:20:37 crc kubenswrapper[4752]: I0929 11:20:37.521200 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4872b8c9-dd16-411f-a040-58971a7cf987-config-data\") pod \"4872b8c9-dd16-411f-a040-58971a7cf987\" (UID: \"4872b8c9-dd16-411f-a040-58971a7cf987\") " Sep 29 11:20:37 crc kubenswrapper[4752]: I0929 11:20:37.521280 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4872b8c9-dd16-411f-a040-58971a7cf987-combined-ca-bundle\") pod \"4872b8c9-dd16-411f-a040-58971a7cf987\" (UID: \"4872b8c9-dd16-411f-a040-58971a7cf987\") " Sep 29 11:20:37 crc kubenswrapper[4752]: I0929 11:20:37.521348 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4872b8c9-dd16-411f-a040-58971a7cf987-db-sync-config-data\") pod \"4872b8c9-dd16-411f-a040-58971a7cf987\" (UID: \"4872b8c9-dd16-411f-a040-58971a7cf987\") " Sep 29 11:20:37 crc kubenswrapper[4752]: I0929 11:20:37.521371 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqcxb\" (UniqueName: \"kubernetes.io/projected/4872b8c9-dd16-411f-a040-58971a7cf987-kube-api-access-zqcxb\") pod \"4872b8c9-dd16-411f-a040-58971a7cf987\" (UID: \"4872b8c9-dd16-411f-a040-58971a7cf987\") " Sep 29 11:20:37 crc kubenswrapper[4752]: I0929 11:20:37.527581 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4872b8c9-dd16-411f-a040-58971a7cf987-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4872b8c9-dd16-411f-a040-58971a7cf987" (UID: "4872b8c9-dd16-411f-a040-58971a7cf987"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:37 crc kubenswrapper[4752]: I0929 11:20:37.527709 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4872b8c9-dd16-411f-a040-58971a7cf987-kube-api-access-zqcxb" (OuterVolumeSpecName: "kube-api-access-zqcxb") pod "4872b8c9-dd16-411f-a040-58971a7cf987" (UID: "4872b8c9-dd16-411f-a040-58971a7cf987"). InnerVolumeSpecName "kube-api-access-zqcxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:20:37 crc kubenswrapper[4752]: I0929 11:20:37.550686 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4872b8c9-dd16-411f-a040-58971a7cf987-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4872b8c9-dd16-411f-a040-58971a7cf987" (UID: "4872b8c9-dd16-411f-a040-58971a7cf987"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:37 crc kubenswrapper[4752]: I0929 11:20:37.567282 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4872b8c9-dd16-411f-a040-58971a7cf987-config-data" (OuterVolumeSpecName: "config-data") pod "4872b8c9-dd16-411f-a040-58971a7cf987" (UID: "4872b8c9-dd16-411f-a040-58971a7cf987"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:37 crc kubenswrapper[4752]: I0929 11:20:37.623571 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4872b8c9-dd16-411f-a040-58971a7cf987-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:37 crc kubenswrapper[4752]: I0929 11:20:37.623610 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4872b8c9-dd16-411f-a040-58971a7cf987-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:37 crc kubenswrapper[4752]: I0929 11:20:37.623621 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqcxb\" (UniqueName: \"kubernetes.io/projected/4872b8c9-dd16-411f-a040-58971a7cf987-kube-api-access-zqcxb\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:37 crc kubenswrapper[4752]: I0929 11:20:37.623631 4752 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4872b8c9-dd16-411f-a040-58971a7cf987-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.031621 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:20:38 crc kubenswrapper[4752]: E0929 11:20:38.032153 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.084537 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" 
event={"ID":"4872b8c9-dd16-411f-a040-58971a7cf987","Type":"ContainerDied","Data":"b20665d3ae0eb40fffab0aa18f787d4d98da943de8156928c06b7556d26edcd6"} Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.084583 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b20665d3ae0eb40fffab0aa18f787d4d98da943de8156928c06b7556d26edcd6" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.084603 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.428619 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:20:38 crc kubenswrapper[4752]: E0929 11:20:38.428970 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4872b8c9-dd16-411f-a040-58971a7cf987" containerName="watcher-kuttl-db-sync" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.428982 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4872b8c9-dd16-411f-a040-58971a7cf987" containerName="watcher-kuttl-db-sync" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.429371 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4872b8c9-dd16-411f-a040-58971a7cf987" containerName="watcher-kuttl-db-sync" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.430406 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.436895 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-g2slg" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.440680 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.440984 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.452409 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.453431 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.455191 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.492596 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.517043 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.526384 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.529551 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.538357 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a22b23af-2fc7-46f3-9c73-87bc740dad1d-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"a22b23af-2fc7-46f3-9c73-87bc740dad1d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.538589 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.538705 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.538985 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-logs\") pod \"watcher-kuttl-api-0\" (UID: \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.539123 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-vc6ws\" (UniqueName: \"kubernetes.io/projected/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-kube-api-access-vc6ws\") pod \"watcher-kuttl-api-0\" (UID: \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.539248 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a22b23af-2fc7-46f3-9c73-87bc740dad1d-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"a22b23af-2fc7-46f3-9c73-87bc740dad1d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.539391 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99tw4\" (UniqueName: \"kubernetes.io/projected/a22b23af-2fc7-46f3-9c73-87bc740dad1d-kube-api-access-99tw4\") pod \"watcher-kuttl-applier-0\" (UID: \"a22b23af-2fc7-46f3-9c73-87bc740dad1d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.539502 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.539601 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22b23af-2fc7-46f3-9c73-87bc740dad1d-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"a22b23af-2fc7-46f3-9c73-87bc740dad1d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.562114 4752 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.640667 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b9b329-fe3a-4af8-8234-2d560d557569-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"59b9b329-fe3a-4af8-8234-2d560d557569\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.640730 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-logs\") pod \"watcher-kuttl-api-0\" (UID: \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.640757 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc6ws\" (UniqueName: \"kubernetes.io/projected/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-kube-api-access-vc6ws\") pod \"watcher-kuttl-api-0\" (UID: \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.640787 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a22b23af-2fc7-46f3-9c73-87bc740dad1d-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"a22b23af-2fc7-46f3-9c73-87bc740dad1d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.640828 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59b9b329-fe3a-4af8-8234-2d560d557569-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: 
\"59b9b329-fe3a-4af8-8234-2d560d557569\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.640851 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99tw4\" (UniqueName: \"kubernetes.io/projected/a22b23af-2fc7-46f3-9c73-87bc740dad1d-kube-api-access-99tw4\") pod \"watcher-kuttl-applier-0\" (UID: \"a22b23af-2fc7-46f3-9c73-87bc740dad1d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.640868 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/59b9b329-fe3a-4af8-8234-2d560d557569-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"59b9b329-fe3a-4af8-8234-2d560d557569\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.640894 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.640918 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6tlk\" (UniqueName: \"kubernetes.io/projected/59b9b329-fe3a-4af8-8234-2d560d557569-kube-api-access-k6tlk\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"59b9b329-fe3a-4af8-8234-2d560d557569\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.640944 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a22b23af-2fc7-46f3-9c73-87bc740dad1d-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"a22b23af-2fc7-46f3-9c73-87bc740dad1d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.640966 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59b9b329-fe3a-4af8-8234-2d560d557569-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"59b9b329-fe3a-4af8-8234-2d560d557569\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.640988 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a22b23af-2fc7-46f3-9c73-87bc740dad1d-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"a22b23af-2fc7-46f3-9c73-87bc740dad1d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.641008 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.641030 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.641541 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-logs\") pod 
\"watcher-kuttl-api-0\" (UID: \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.642448 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a22b23af-2fc7-46f3-9c73-87bc740dad1d-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"a22b23af-2fc7-46f3-9c73-87bc740dad1d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.645423 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.645964 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a22b23af-2fc7-46f3-9c73-87bc740dad1d-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"a22b23af-2fc7-46f3-9c73-87bc740dad1d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.647150 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.647156 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22b23af-2fc7-46f3-9c73-87bc740dad1d-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"a22b23af-2fc7-46f3-9c73-87bc740dad1d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 
11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.647282 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.669132 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99tw4\" (UniqueName: \"kubernetes.io/projected/a22b23af-2fc7-46f3-9c73-87bc740dad1d-kube-api-access-99tw4\") pod \"watcher-kuttl-applier-0\" (UID: \"a22b23af-2fc7-46f3-9c73-87bc740dad1d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.670423 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc6ws\" (UniqueName: \"kubernetes.io/projected/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-kube-api-access-vc6ws\") pod \"watcher-kuttl-api-0\" (UID: \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.742479 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59b9b329-fe3a-4af8-8234-2d560d557569-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"59b9b329-fe3a-4af8-8234-2d560d557569\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.742546 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/59b9b329-fe3a-4af8-8234-2d560d557569-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"59b9b329-fe3a-4af8-8234-2d560d557569\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:38 crc 
kubenswrapper[4752]: I0929 11:20:38.742593 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6tlk\" (UniqueName: \"kubernetes.io/projected/59b9b329-fe3a-4af8-8234-2d560d557569-kube-api-access-k6tlk\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"59b9b329-fe3a-4af8-8234-2d560d557569\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.742629 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59b9b329-fe3a-4af8-8234-2d560d557569-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"59b9b329-fe3a-4af8-8234-2d560d557569\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.742677 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b9b329-fe3a-4af8-8234-2d560d557569-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"59b9b329-fe3a-4af8-8234-2d560d557569\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.743668 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59b9b329-fe3a-4af8-8234-2d560d557569-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"59b9b329-fe3a-4af8-8234-2d560d557569\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.746150 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b9b329-fe3a-4af8-8234-2d560d557569-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"59b9b329-fe3a-4af8-8234-2d560d557569\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:38 
crc kubenswrapper[4752]: I0929 11:20:38.746260 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.746636 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59b9b329-fe3a-4af8-8234-2d560d557569-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"59b9b329-fe3a-4af8-8234-2d560d557569\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.759359 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/59b9b329-fe3a-4af8-8234-2d560d557569-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"59b9b329-fe3a-4af8-8234-2d560d557569\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.765002 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6tlk\" (UniqueName: \"kubernetes.io/projected/59b9b329-fe3a-4af8-8234-2d560d557569-kube-api-access-k6tlk\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"59b9b329-fe3a-4af8-8234-2d560d557569\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.793498 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:20:38 crc kubenswrapper[4752]: I0929 11:20:38.855751 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:39 crc kubenswrapper[4752]: I0929 11:20:39.204236 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:20:39 crc kubenswrapper[4752]: I0929 11:20:39.310526 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:20:39 crc kubenswrapper[4752]: I0929 11:20:39.346869 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:20:39 crc kubenswrapper[4752]: W0929 11:20:39.354142 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59b9b329_fe3a_4af8_8234_2d560d557569.slice/crio-68430d5dd8dc79065edf20cb63d0ccdffccfd570c7f7bc7a6f3e21e0e42b2bee WatchSource:0}: Error finding container 68430d5dd8dc79065edf20cb63d0ccdffccfd570c7f7bc7a6f3e21e0e42b2bee: Status 404 returned error can't find the container with id 68430d5dd8dc79065edf20cb63d0ccdffccfd570c7f7bc7a6f3e21e0e42b2bee Sep 29 11:20:40 crc kubenswrapper[4752]: I0929 11:20:40.109668 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"59b9b329-fe3a-4af8-8234-2d560d557569","Type":"ContainerStarted","Data":"99294096ba745b02ccdde54944f3af21a500b75829775a82857c3603fb9f1cdb"} Sep 29 11:20:40 crc kubenswrapper[4752]: I0929 11:20:40.110077 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"59b9b329-fe3a-4af8-8234-2d560d557569","Type":"ContainerStarted","Data":"68430d5dd8dc79065edf20cb63d0ccdffccfd570c7f7bc7a6f3e21e0e42b2bee"} Sep 29 11:20:40 crc kubenswrapper[4752]: I0929 11:20:40.111030 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
event={"ID":"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26","Type":"ContainerStarted","Data":"f3635e71220d0d708499ef8ad13d80c71cea4e85679662528dfdfebd177287d0"} Sep 29 11:20:40 crc kubenswrapper[4752]: I0929 11:20:40.111058 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26","Type":"ContainerStarted","Data":"00e421de18d9c5da494915a2bc6fa1b4be38bdb38fb78418700a814db4933a18"} Sep 29 11:20:40 crc kubenswrapper[4752]: I0929 11:20:40.112289 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"a22b23af-2fc7-46f3-9c73-87bc740dad1d","Type":"ContainerStarted","Data":"18eccdbe0dd6a0b5ffa48b63bee01e4af645e35adbb34d5064e809e647e8fe8b"} Sep 29 11:20:40 crc kubenswrapper[4752]: I0929 11:20:40.112330 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"a22b23af-2fc7-46f3-9c73-87bc740dad1d","Type":"ContainerStarted","Data":"6cff88a35a4dfcb83c2488fefc4461c35ec44fc1325e4133ad265c2e53254ffc"} Sep 29 11:20:40 crc kubenswrapper[4752]: I0929 11:20:40.132258 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.132242901 podStartE2EDuration="2.132242901s" podCreationTimestamp="2025-09-29 11:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:20:40.124622562 +0000 UTC m=+2180.913764229" watchObservedRunningTime="2025-09-29 11:20:40.132242901 +0000 UTC m=+2180.921384568" Sep 29 11:20:40 crc kubenswrapper[4752]: I0929 11:20:40.144762 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.144744136 podStartE2EDuration="2.144744136s" podCreationTimestamp="2025-09-29 11:20:38 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:20:40.141970804 +0000 UTC m=+2180.931112471" watchObservedRunningTime="2025-09-29 11:20:40.144744136 +0000 UTC m=+2180.933885803" Sep 29 11:20:41 crc kubenswrapper[4752]: I0929 11:20:41.120889 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26","Type":"ContainerStarted","Data":"2b16e28236bf1fce5ca0b48f7551fddbb674dc12077a05c1597a0697beb0068e"} Sep 29 11:20:41 crc kubenswrapper[4752]: I0929 11:20:41.160572 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=3.16054121 podStartE2EDuration="3.16054121s" podCreationTimestamp="2025-09-29 11:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:20:41.152043229 +0000 UTC m=+2181.941184906" watchObservedRunningTime="2025-09-29 11:20:41.16054121 +0000 UTC m=+2181.949682907" Sep 29 11:20:42 crc kubenswrapper[4752]: I0929 11:20:42.128144 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:43 crc kubenswrapper[4752]: I0929 11:20:43.746844 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:43 crc kubenswrapper[4752]: I0929 11:20:43.794548 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:20:44 crc kubenswrapper[4752]: I0929 11:20:44.142797 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 11:20:44 crc kubenswrapper[4752]: I0929 11:20:44.286153 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:48 crc kubenswrapper[4752]: I0929 11:20:48.747031 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:48 crc kubenswrapper[4752]: I0929 11:20:48.751369 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:48 crc kubenswrapper[4752]: I0929 11:20:48.794452 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:20:48 crc kubenswrapper[4752]: I0929 11:20:48.821369 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:20:48 crc kubenswrapper[4752]: I0929 11:20:48.856716 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:48 crc kubenswrapper[4752]: I0929 11:20:48.880485 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:49 crc kubenswrapper[4752]: I0929 11:20:49.186775 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:49 crc kubenswrapper[4752]: I0929 11:20:49.190879 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:20:49 crc kubenswrapper[4752]: I0929 11:20:49.222926 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:20:49 crc kubenswrapper[4752]: I0929 11:20:49.229231 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:20:49 crc kubenswrapper[4752]: 
I0929 11:20:49.306660 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:51 crc kubenswrapper[4752]: I0929 11:20:51.378728 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:20:51 crc kubenswrapper[4752]: I0929 11:20:51.379360 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="8a0c35a4-54fb-48cf-8ead-1dab82509893" containerName="ceilometer-central-agent" containerID="cri-o://6e4482e9ccb24286f1472daaae355e9e4140fcaeadc83ba16633469bb5b69c60" gracePeriod=30 Sep 29 11:20:51 crc kubenswrapper[4752]: I0929 11:20:51.379426 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="8a0c35a4-54fb-48cf-8ead-1dab82509893" containerName="proxy-httpd" containerID="cri-o://b860ce00d8b55e48951cec932077c9833ddb0c119831b7af972e7a19e2003420" gracePeriod=30 Sep 29 11:20:51 crc kubenswrapper[4752]: I0929 11:20:51.379497 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="8a0c35a4-54fb-48cf-8ead-1dab82509893" containerName="ceilometer-notification-agent" containerID="cri-o://574ffa2e61d591a17326b83fa0cc3b75d916a4b261680dd3d439dca34c14f20c" gracePeriod=30 Sep 29 11:20:51 crc kubenswrapper[4752]: I0929 11:20:51.379426 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="8a0c35a4-54fb-48cf-8ead-1dab82509893" containerName="sg-core" containerID="cri-o://8ce9834b371b8dbf84399b7f52ffa05a8e3c250b1bfc3cf2182abb12a958f8fe" gracePeriod=30 Sep 29 11:20:52 crc kubenswrapper[4752]: I0929 11:20:52.031653 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:20:52 crc kubenswrapper[4752]: E0929 11:20:52.031916 4752 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:20:52 crc kubenswrapper[4752]: I0929 11:20:52.215627 4752 generic.go:334] "Generic (PLEG): container finished" podID="8a0c35a4-54fb-48cf-8ead-1dab82509893" containerID="b860ce00d8b55e48951cec932077c9833ddb0c119831b7af972e7a19e2003420" exitCode=0 Sep 29 11:20:52 crc kubenswrapper[4752]: I0929 11:20:52.215663 4752 generic.go:334] "Generic (PLEG): container finished" podID="8a0c35a4-54fb-48cf-8ead-1dab82509893" containerID="8ce9834b371b8dbf84399b7f52ffa05a8e3c250b1bfc3cf2182abb12a958f8fe" exitCode=2 Sep 29 11:20:52 crc kubenswrapper[4752]: I0929 11:20:52.215676 4752 generic.go:334] "Generic (PLEG): container finished" podID="8a0c35a4-54fb-48cf-8ead-1dab82509893" containerID="6e4482e9ccb24286f1472daaae355e9e4140fcaeadc83ba16633469bb5b69c60" exitCode=0 Sep 29 11:20:52 crc kubenswrapper[4752]: I0929 11:20:52.215673 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8a0c35a4-54fb-48cf-8ead-1dab82509893","Type":"ContainerDied","Data":"b860ce00d8b55e48951cec932077c9833ddb0c119831b7af972e7a19e2003420"} Sep 29 11:20:52 crc kubenswrapper[4752]: I0929 11:20:52.215779 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8a0c35a4-54fb-48cf-8ead-1dab82509893","Type":"ContainerDied","Data":"8ce9834b371b8dbf84399b7f52ffa05a8e3c250b1bfc3cf2182abb12a958f8fe"} Sep 29 11:20:52 crc kubenswrapper[4752]: I0929 11:20:52.215848 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"8a0c35a4-54fb-48cf-8ead-1dab82509893","Type":"ContainerDied","Data":"6e4482e9ccb24286f1472daaae355e9e4140fcaeadc83ba16633469bb5b69c60"} Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.226026 4752 generic.go:334] "Generic (PLEG): container finished" podID="8a0c35a4-54fb-48cf-8ead-1dab82509893" containerID="574ffa2e61d591a17326b83fa0cc3b75d916a4b261680dd3d439dca34c14f20c" exitCode=0 Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.226188 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8a0c35a4-54fb-48cf-8ead-1dab82509893","Type":"ContainerDied","Data":"574ffa2e61d591a17326b83fa0cc3b75d916a4b261680dd3d439dca34c14f20c"} Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.411611 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.472114 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-sg-core-conf-yaml\") pod \"8a0c35a4-54fb-48cf-8ead-1dab82509893\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.472168 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-combined-ca-bundle\") pod \"8a0c35a4-54fb-48cf-8ead-1dab82509893\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.472235 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0c35a4-54fb-48cf-8ead-1dab82509893-run-httpd\") pod \"8a0c35a4-54fb-48cf-8ead-1dab82509893\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " Sep 29 11:20:53 crc kubenswrapper[4752]: 
I0929 11:20:53.472265 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-scripts\") pod \"8a0c35a4-54fb-48cf-8ead-1dab82509893\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.472305 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-ceilometer-tls-certs\") pod \"8a0c35a4-54fb-48cf-8ead-1dab82509893\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.472333 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jnnz\" (UniqueName: \"kubernetes.io/projected/8a0c35a4-54fb-48cf-8ead-1dab82509893-kube-api-access-5jnnz\") pod \"8a0c35a4-54fb-48cf-8ead-1dab82509893\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.472353 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-config-data\") pod \"8a0c35a4-54fb-48cf-8ead-1dab82509893\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.472377 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0c35a4-54fb-48cf-8ead-1dab82509893-log-httpd\") pod \"8a0c35a4-54fb-48cf-8ead-1dab82509893\" (UID: \"8a0c35a4-54fb-48cf-8ead-1dab82509893\") " Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.473119 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a0c35a4-54fb-48cf-8ead-1dab82509893-log-httpd" (OuterVolumeSpecName: "log-httpd") pod 
"8a0c35a4-54fb-48cf-8ead-1dab82509893" (UID: "8a0c35a4-54fb-48cf-8ead-1dab82509893"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.474285 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a0c35a4-54fb-48cf-8ead-1dab82509893-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8a0c35a4-54fb-48cf-8ead-1dab82509893" (UID: "8a0c35a4-54fb-48cf-8ead-1dab82509893"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.479850 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a0c35a4-54fb-48cf-8ead-1dab82509893-kube-api-access-5jnnz" (OuterVolumeSpecName: "kube-api-access-5jnnz") pod "8a0c35a4-54fb-48cf-8ead-1dab82509893" (UID: "8a0c35a4-54fb-48cf-8ead-1dab82509893"). InnerVolumeSpecName "kube-api-access-5jnnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.493710 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-scripts" (OuterVolumeSpecName: "scripts") pod "8a0c35a4-54fb-48cf-8ead-1dab82509893" (UID: "8a0c35a4-54fb-48cf-8ead-1dab82509893"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.499294 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8a0c35a4-54fb-48cf-8ead-1dab82509893" (UID: "8a0c35a4-54fb-48cf-8ead-1dab82509893"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.537961 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a0c35a4-54fb-48cf-8ead-1dab82509893" (UID: "8a0c35a4-54fb-48cf-8ead-1dab82509893"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.550081 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8a0c35a4-54fb-48cf-8ead-1dab82509893" (UID: "8a0c35a4-54fb-48cf-8ead-1dab82509893"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.557251 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-config-data" (OuterVolumeSpecName: "config-data") pod "8a0c35a4-54fb-48cf-8ead-1dab82509893" (UID: "8a0c35a4-54fb-48cf-8ead-1dab82509893"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.574209 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.574238 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0c35a4-54fb-48cf-8ead-1dab82509893-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.574251 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.574261 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.574268 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0c35a4-54fb-48cf-8ead-1dab82509893-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.574276 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.574284 4752 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0c35a4-54fb-48cf-8ead-1dab82509893-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:53 crc kubenswrapper[4752]: I0929 11:20:53.574292 4752 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-5jnnz\" (UniqueName: \"kubernetes.io/projected/8a0c35a4-54fb-48cf-8ead-1dab82509893-kube-api-access-5jnnz\") on node \"crc\" DevicePath \"\"" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.237540 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8a0c35a4-54fb-48cf-8ead-1dab82509893","Type":"ContainerDied","Data":"82eed08bce346d24b2388e166bfc55e2ede547defd69fe6c92c1897ec925b989"} Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.237632 4752 scope.go:117] "RemoveContainer" containerID="b860ce00d8b55e48951cec932077c9833ddb0c119831b7af972e7a19e2003420" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.237665 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.266733 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.279955 4752 scope.go:117] "RemoveContainer" containerID="8ce9834b371b8dbf84399b7f52ffa05a8e3c250b1bfc3cf2182abb12a958f8fe" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.281205 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.345362 4752 scope.go:117] "RemoveContainer" containerID="574ffa2e61d591a17326b83fa0cc3b75d916a4b261680dd3d439dca34c14f20c" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.351254 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:20:54 crc kubenswrapper[4752]: E0929 11:20:54.352130 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a0c35a4-54fb-48cf-8ead-1dab82509893" containerName="proxy-httpd" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.352175 4752 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="8a0c35a4-54fb-48cf-8ead-1dab82509893" containerName="proxy-httpd" Sep 29 11:20:54 crc kubenswrapper[4752]: E0929 11:20:54.352193 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a0c35a4-54fb-48cf-8ead-1dab82509893" containerName="ceilometer-notification-agent" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.352203 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0c35a4-54fb-48cf-8ead-1dab82509893" containerName="ceilometer-notification-agent" Sep 29 11:20:54 crc kubenswrapper[4752]: E0929 11:20:54.352238 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a0c35a4-54fb-48cf-8ead-1dab82509893" containerName="ceilometer-central-agent" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.352246 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0c35a4-54fb-48cf-8ead-1dab82509893" containerName="ceilometer-central-agent" Sep 29 11:20:54 crc kubenswrapper[4752]: E0929 11:20:54.352261 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a0c35a4-54fb-48cf-8ead-1dab82509893" containerName="sg-core" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.352269 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0c35a4-54fb-48cf-8ead-1dab82509893" containerName="sg-core" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.352883 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a0c35a4-54fb-48cf-8ead-1dab82509893" containerName="ceilometer-central-agent" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.352997 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a0c35a4-54fb-48cf-8ead-1dab82509893" containerName="proxy-httpd" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.353058 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a0c35a4-54fb-48cf-8ead-1dab82509893" containerName="ceilometer-notification-agent" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.353114 
4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a0c35a4-54fb-48cf-8ead-1dab82509893" containerName="sg-core" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.356319 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.358213 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.358763 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.358957 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.366174 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.371628 4752 scope.go:117] "RemoveContainer" containerID="6e4482e9ccb24286f1472daaae355e9e4140fcaeadc83ba16633469bb5b69c60" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.504054 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-log-httpd\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.504311 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-run-httpd\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 
11:20:54.504384 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-config-data\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.504456 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.504665 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.504913 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq2wl\" (UniqueName: \"kubernetes.io/projected/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-kube-api-access-qq2wl\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.504981 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.505108 4752 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-scripts\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.606643 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-run-httpd\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.606694 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-config-data\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.606718 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.606820 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.606871 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq2wl\" (UniqueName: 
\"kubernetes.io/projected/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-kube-api-access-qq2wl\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.606894 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.606916 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-scripts\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.606963 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-log-httpd\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.607542 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-log-httpd\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.607570 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-run-httpd\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 
11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.612829 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.613935 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-config-data\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.614059 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-scripts\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.614682 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.614861 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.628018 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq2wl\" (UniqueName: 
\"kubernetes.io/projected/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-kube-api-access-qq2wl\") pod \"ceilometer-0\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:54 crc kubenswrapper[4752]: I0929 11:20:54.675532 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:20:55 crc kubenswrapper[4752]: I0929 11:20:55.162934 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:20:55 crc kubenswrapper[4752]: W0929 11:20:55.163277 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64ab5c9b_f499_4e45_a9a5_82d2a44d7a29.slice/crio-fc0a6467d5868225cab54c24e69a1e1cada95adfcac643d30a0b61c326c4b241 WatchSource:0}: Error finding container fc0a6467d5868225cab54c24e69a1e1cada95adfcac643d30a0b61c326c4b241: Status 404 returned error can't find the container with id fc0a6467d5868225cab54c24e69a1e1cada95adfcac643d30a0b61c326c4b241 Sep 29 11:20:55 crc kubenswrapper[4752]: I0929 11:20:55.248836 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29","Type":"ContainerStarted","Data":"fc0a6467d5868225cab54c24e69a1e1cada95adfcac643d30a0b61c326c4b241"} Sep 29 11:20:56 crc kubenswrapper[4752]: I0929 11:20:56.075747 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a0c35a4-54fb-48cf-8ead-1dab82509893" path="/var/lib/kubelet/pods/8a0c35a4-54fb-48cf-8ead-1dab82509893/volumes" Sep 29 11:20:56 crc kubenswrapper[4752]: I0929 11:20:56.258045 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29","Type":"ContainerStarted","Data":"29a90c8915ed05357a7831ca4c06378144417f842bc1fba44504af4d3e816a39"} Sep 29 11:20:56 crc kubenswrapper[4752]: I0929 
11:20:56.522272 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 11:20:57 crc kubenswrapper[4752]: I0929 11:20:57.267045 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29","Type":"ContainerStarted","Data":"76f6435988eec008338092b7cbed1aa1442ca8a8c1350b4b32d903bd7b983ff6"} Sep 29 11:20:58 crc kubenswrapper[4752]: I0929 11:20:58.280249 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29","Type":"ContainerStarted","Data":"dca2a889b7881e2450483c36abd8faf6f3b25b0deb45ae32407782b74f596d58"} Sep 29 11:21:01 crc kubenswrapper[4752]: I0929 11:21:01.311232 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29","Type":"ContainerStarted","Data":"52b37aab03b1cef60bb61ff8aa4e7017a35aa8d8a53cec2a68f08f89b92707be"} Sep 29 11:21:01 crc kubenswrapper[4752]: I0929 11:21:01.311922 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:21:01 crc kubenswrapper[4752]: I0929 11:21:01.340610 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.026027958 podStartE2EDuration="7.340590658s" podCreationTimestamp="2025-09-29 11:20:54 +0000 UTC" firstStartedPulling="2025-09-29 11:20:55.16551303 +0000 UTC m=+2195.954654697" lastFinishedPulling="2025-09-29 11:21:00.48007573 +0000 UTC m=+2201.269217397" observedRunningTime="2025-09-29 11:21:01.340218789 +0000 UTC m=+2202.129360486" watchObservedRunningTime="2025-09-29 11:21:01.340590658 +0000 UTC m=+2202.129732325" Sep 29 11:21:06 crc kubenswrapper[4752]: I0929 11:21:06.031446 4752 scope.go:117] "RemoveContainer" 
containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:21:06 crc kubenswrapper[4752]: E0929 11:21:06.032907 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:21:19 crc kubenswrapper[4752]: I0929 11:21:19.031907 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:21:19 crc kubenswrapper[4752]: E0929 11:21:19.032937 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:21:21 crc kubenswrapper[4752]: I0929 11:21:21.753703 4752 scope.go:117] "RemoveContainer" containerID="49b5e72ade9fa195e2885a3effa0b343cde6622f15828e925bd8c0c020891fa1" Sep 29 11:21:21 crc kubenswrapper[4752]: I0929 11:21:21.776041 4752 scope.go:117] "RemoveContainer" containerID="413a58f9ffaecd4c0151757e902267b0090dd6db37fc31ff45e03744b2d4c1ed" Sep 29 11:21:24 crc kubenswrapper[4752]: I0929 11:21:24.703641 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:21:30 crc kubenswrapper[4752]: I0929 11:21:30.035999 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:21:30 crc kubenswrapper[4752]: 
E0929 11:21:30.037272 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:21:43 crc kubenswrapper[4752]: I0929 11:21:43.031127 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:21:43 crc kubenswrapper[4752]: E0929 11:21:43.031959 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:21:57 crc kubenswrapper[4752]: I0929 11:21:57.031191 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:21:57 crc kubenswrapper[4752]: E0929 11:21:57.033429 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:22:12 crc kubenswrapper[4752]: I0929 11:22:12.031242 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:22:12 crc 
kubenswrapper[4752]: E0929 11:22:12.031921 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:22:21 crc kubenswrapper[4752]: I0929 11:22:21.867249 4752 scope.go:117] "RemoveContainer" containerID="bb590b8744c608a6ef5cdfc1dc8db5a55a975093e91c7c95e5fec12d0bc116b7" Sep 29 11:22:21 crc kubenswrapper[4752]: I0929 11:22:21.900056 4752 scope.go:117] "RemoveContainer" containerID="d9ce4c48f5a8205cef2943a7896798e3861683be3a68bec961bcc5a03155448a" Sep 29 11:22:27 crc kubenswrapper[4752]: I0929 11:22:27.031427 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:22:27 crc kubenswrapper[4752]: E0929 11:22:27.032215 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:22:42 crc kubenswrapper[4752]: I0929 11:22:42.031239 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:22:42 crc kubenswrapper[4752]: E0929 11:22:42.032043 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:22:54 crc kubenswrapper[4752]: I0929 11:22:54.032568 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:22:54 crc kubenswrapper[4752]: E0929 11:22:54.034547 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:23:07 crc kubenswrapper[4752]: I0929 11:23:07.031239 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:23:07 crc kubenswrapper[4752]: E0929 11:23:07.031966 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:23:18 crc kubenswrapper[4752]: I0929 11:23:18.391320 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sm8xr"] Sep 29 11:23:18 crc kubenswrapper[4752]: I0929 11:23:18.393947 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sm8xr" Sep 29 11:23:18 crc kubenswrapper[4752]: I0929 11:23:18.405918 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sm8xr"] Sep 29 11:23:18 crc kubenswrapper[4752]: I0929 11:23:18.432578 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2938c9c3-83a5-4b77-b910-5a0b03b59ad1-catalog-content\") pod \"redhat-operators-sm8xr\" (UID: \"2938c9c3-83a5-4b77-b910-5a0b03b59ad1\") " pod="openshift-marketplace/redhat-operators-sm8xr" Sep 29 11:23:18 crc kubenswrapper[4752]: I0929 11:23:18.432638 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2938c9c3-83a5-4b77-b910-5a0b03b59ad1-utilities\") pod \"redhat-operators-sm8xr\" (UID: \"2938c9c3-83a5-4b77-b910-5a0b03b59ad1\") " pod="openshift-marketplace/redhat-operators-sm8xr" Sep 29 11:23:18 crc kubenswrapper[4752]: I0929 11:23:18.432673 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br6t9\" (UniqueName: \"kubernetes.io/projected/2938c9c3-83a5-4b77-b910-5a0b03b59ad1-kube-api-access-br6t9\") pod \"redhat-operators-sm8xr\" (UID: \"2938c9c3-83a5-4b77-b910-5a0b03b59ad1\") " pod="openshift-marketplace/redhat-operators-sm8xr" Sep 29 11:23:18 crc kubenswrapper[4752]: I0929 11:23:18.534861 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2938c9c3-83a5-4b77-b910-5a0b03b59ad1-catalog-content\") pod \"redhat-operators-sm8xr\" (UID: \"2938c9c3-83a5-4b77-b910-5a0b03b59ad1\") " pod="openshift-marketplace/redhat-operators-sm8xr" Sep 29 11:23:18 crc kubenswrapper[4752]: I0929 11:23:18.534943 4752 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2938c9c3-83a5-4b77-b910-5a0b03b59ad1-utilities\") pod \"redhat-operators-sm8xr\" (UID: \"2938c9c3-83a5-4b77-b910-5a0b03b59ad1\") " pod="openshift-marketplace/redhat-operators-sm8xr" Sep 29 11:23:18 crc kubenswrapper[4752]: I0929 11:23:18.534997 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br6t9\" (UniqueName: \"kubernetes.io/projected/2938c9c3-83a5-4b77-b910-5a0b03b59ad1-kube-api-access-br6t9\") pod \"redhat-operators-sm8xr\" (UID: \"2938c9c3-83a5-4b77-b910-5a0b03b59ad1\") " pod="openshift-marketplace/redhat-operators-sm8xr" Sep 29 11:23:18 crc kubenswrapper[4752]: I0929 11:23:18.535399 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2938c9c3-83a5-4b77-b910-5a0b03b59ad1-catalog-content\") pod \"redhat-operators-sm8xr\" (UID: \"2938c9c3-83a5-4b77-b910-5a0b03b59ad1\") " pod="openshift-marketplace/redhat-operators-sm8xr" Sep 29 11:23:18 crc kubenswrapper[4752]: I0929 11:23:18.535526 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2938c9c3-83a5-4b77-b910-5a0b03b59ad1-utilities\") pod \"redhat-operators-sm8xr\" (UID: \"2938c9c3-83a5-4b77-b910-5a0b03b59ad1\") " pod="openshift-marketplace/redhat-operators-sm8xr" Sep 29 11:23:18 crc kubenswrapper[4752]: I0929 11:23:18.555112 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br6t9\" (UniqueName: \"kubernetes.io/projected/2938c9c3-83a5-4b77-b910-5a0b03b59ad1-kube-api-access-br6t9\") pod \"redhat-operators-sm8xr\" (UID: \"2938c9c3-83a5-4b77-b910-5a0b03b59ad1\") " pod="openshift-marketplace/redhat-operators-sm8xr" Sep 29 11:23:18 crc kubenswrapper[4752]: I0929 11:23:18.716223 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sm8xr" Sep 29 11:23:19 crc kubenswrapper[4752]: I0929 11:23:19.031059 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:23:19 crc kubenswrapper[4752]: E0929 11:23:19.031341 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:23:19 crc kubenswrapper[4752]: I0929 11:23:19.179598 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sm8xr"] Sep 29 11:23:19 crc kubenswrapper[4752]: I0929 11:23:19.497875 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sm8xr" event={"ID":"2938c9c3-83a5-4b77-b910-5a0b03b59ad1","Type":"ContainerStarted","Data":"8ea31aa1ee015b8b55e1948ec04df21fb95dc2aa135788494038f4c2315eabf5"} Sep 29 11:23:20 crc kubenswrapper[4752]: I0929 11:23:20.509063 4752 generic.go:334] "Generic (PLEG): container finished" podID="2938c9c3-83a5-4b77-b910-5a0b03b59ad1" containerID="fb92b3a4a57cfca3386cc71bf3b3ca2a5193780ea4d688470a9858348cf980c0" exitCode=0 Sep 29 11:23:20 crc kubenswrapper[4752]: I0929 11:23:20.509161 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sm8xr" event={"ID":"2938c9c3-83a5-4b77-b910-5a0b03b59ad1","Type":"ContainerDied","Data":"fb92b3a4a57cfca3386cc71bf3b3ca2a5193780ea4d688470a9858348cf980c0"} Sep 29 11:23:21 crc kubenswrapper[4752]: I0929 11:23:21.523173 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sm8xr" 
event={"ID":"2938c9c3-83a5-4b77-b910-5a0b03b59ad1","Type":"ContainerStarted","Data":"cc718e910abbc01dd2b183e42a8b5fe01af333504216d4bf35bd27c045e823d6"} Sep 29 11:23:22 crc kubenswrapper[4752]: I0929 11:23:22.533323 4752 generic.go:334] "Generic (PLEG): container finished" podID="2938c9c3-83a5-4b77-b910-5a0b03b59ad1" containerID="cc718e910abbc01dd2b183e42a8b5fe01af333504216d4bf35bd27c045e823d6" exitCode=0 Sep 29 11:23:22 crc kubenswrapper[4752]: I0929 11:23:22.533405 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sm8xr" event={"ID":"2938c9c3-83a5-4b77-b910-5a0b03b59ad1","Type":"ContainerDied","Data":"cc718e910abbc01dd2b183e42a8b5fe01af333504216d4bf35bd27c045e823d6"} Sep 29 11:23:24 crc kubenswrapper[4752]: I0929 11:23:24.549505 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sm8xr" event={"ID":"2938c9c3-83a5-4b77-b910-5a0b03b59ad1","Type":"ContainerStarted","Data":"b82d14df663f32666e02ea398b87d770cdce3615ef777a9f53e3412c74861672"} Sep 29 11:23:24 crc kubenswrapper[4752]: I0929 11:23:24.572692 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sm8xr" podStartSLOduration=3.446177282 podStartE2EDuration="6.572672894s" podCreationTimestamp="2025-09-29 11:23:18 +0000 UTC" firstStartedPulling="2025-09-29 11:23:20.511110546 +0000 UTC m=+2341.300252213" lastFinishedPulling="2025-09-29 11:23:23.637606158 +0000 UTC m=+2344.426747825" observedRunningTime="2025-09-29 11:23:24.565669152 +0000 UTC m=+2345.354810809" watchObservedRunningTime="2025-09-29 11:23:24.572672894 +0000 UTC m=+2345.361814561" Sep 29 11:23:28 crc kubenswrapper[4752]: I0929 11:23:28.716946 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sm8xr" Sep 29 11:23:28 crc kubenswrapper[4752]: I0929 11:23:28.717569 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-sm8xr" Sep 29 11:23:28 crc kubenswrapper[4752]: I0929 11:23:28.768442 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sm8xr" Sep 29 11:23:29 crc kubenswrapper[4752]: I0929 11:23:29.653403 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sm8xr" Sep 29 11:23:30 crc kubenswrapper[4752]: I0929 11:23:30.979878 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sm8xr"] Sep 29 11:23:31 crc kubenswrapper[4752]: I0929 11:23:31.618588 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sm8xr" podUID="2938c9c3-83a5-4b77-b910-5a0b03b59ad1" containerName="registry-server" containerID="cri-o://b82d14df663f32666e02ea398b87d770cdce3615ef777a9f53e3412c74861672" gracePeriod=2 Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.031155 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:23:32 crc kubenswrapper[4752]: E0929 11:23:32.031704 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.041755 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sm8xr" Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.058922 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br6t9\" (UniqueName: \"kubernetes.io/projected/2938c9c3-83a5-4b77-b910-5a0b03b59ad1-kube-api-access-br6t9\") pod \"2938c9c3-83a5-4b77-b910-5a0b03b59ad1\" (UID: \"2938c9c3-83a5-4b77-b910-5a0b03b59ad1\") " Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.058990 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2938c9c3-83a5-4b77-b910-5a0b03b59ad1-utilities\") pod \"2938c9c3-83a5-4b77-b910-5a0b03b59ad1\" (UID: \"2938c9c3-83a5-4b77-b910-5a0b03b59ad1\") " Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.059159 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2938c9c3-83a5-4b77-b910-5a0b03b59ad1-catalog-content\") pod \"2938c9c3-83a5-4b77-b910-5a0b03b59ad1\" (UID: \"2938c9c3-83a5-4b77-b910-5a0b03b59ad1\") " Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.060187 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2938c9c3-83a5-4b77-b910-5a0b03b59ad1-utilities" (OuterVolumeSpecName: "utilities") pod "2938c9c3-83a5-4b77-b910-5a0b03b59ad1" (UID: "2938c9c3-83a5-4b77-b910-5a0b03b59ad1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.065433 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2938c9c3-83a5-4b77-b910-5a0b03b59ad1-kube-api-access-br6t9" (OuterVolumeSpecName: "kube-api-access-br6t9") pod "2938c9c3-83a5-4b77-b910-5a0b03b59ad1" (UID: "2938c9c3-83a5-4b77-b910-5a0b03b59ad1"). InnerVolumeSpecName "kube-api-access-br6t9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.154960 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2938c9c3-83a5-4b77-b910-5a0b03b59ad1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2938c9c3-83a5-4b77-b910-5a0b03b59ad1" (UID: "2938c9c3-83a5-4b77-b910-5a0b03b59ad1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.161619 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br6t9\" (UniqueName: \"kubernetes.io/projected/2938c9c3-83a5-4b77-b910-5a0b03b59ad1-kube-api-access-br6t9\") on node \"crc\" DevicePath \"\"" Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.161648 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2938c9c3-83a5-4b77-b910-5a0b03b59ad1-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.161656 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2938c9c3-83a5-4b77-b910-5a0b03b59ad1-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.628299 4752 generic.go:334] "Generic (PLEG): container finished" podID="2938c9c3-83a5-4b77-b910-5a0b03b59ad1" containerID="b82d14df663f32666e02ea398b87d770cdce3615ef777a9f53e3412c74861672" exitCode=0 Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.628362 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sm8xr" event={"ID":"2938c9c3-83a5-4b77-b910-5a0b03b59ad1","Type":"ContainerDied","Data":"b82d14df663f32666e02ea398b87d770cdce3615ef777a9f53e3412c74861672"} Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.628559 4752 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-sm8xr" event={"ID":"2938c9c3-83a5-4b77-b910-5a0b03b59ad1","Type":"ContainerDied","Data":"8ea31aa1ee015b8b55e1948ec04df21fb95dc2aa135788494038f4c2315eabf5"} Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.628589 4752 scope.go:117] "RemoveContainer" containerID="b82d14df663f32666e02ea398b87d770cdce3615ef777a9f53e3412c74861672" Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.628381 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sm8xr" Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.649743 4752 scope.go:117] "RemoveContainer" containerID="cc718e910abbc01dd2b183e42a8b5fe01af333504216d4bf35bd27c045e823d6" Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.659588 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sm8xr"] Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.666948 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sm8xr"] Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.683459 4752 scope.go:117] "RemoveContainer" containerID="fb92b3a4a57cfca3386cc71bf3b3ca2a5193780ea4d688470a9858348cf980c0" Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.704414 4752 scope.go:117] "RemoveContainer" containerID="b82d14df663f32666e02ea398b87d770cdce3615ef777a9f53e3412c74861672" Sep 29 11:23:32 crc kubenswrapper[4752]: E0929 11:23:32.705089 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b82d14df663f32666e02ea398b87d770cdce3615ef777a9f53e3412c74861672\": container with ID starting with b82d14df663f32666e02ea398b87d770cdce3615ef777a9f53e3412c74861672 not found: ID does not exist" containerID="b82d14df663f32666e02ea398b87d770cdce3615ef777a9f53e3412c74861672" Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.705136 4752 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b82d14df663f32666e02ea398b87d770cdce3615ef777a9f53e3412c74861672"} err="failed to get container status \"b82d14df663f32666e02ea398b87d770cdce3615ef777a9f53e3412c74861672\": rpc error: code = NotFound desc = could not find container \"b82d14df663f32666e02ea398b87d770cdce3615ef777a9f53e3412c74861672\": container with ID starting with b82d14df663f32666e02ea398b87d770cdce3615ef777a9f53e3412c74861672 not found: ID does not exist" Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.705163 4752 scope.go:117] "RemoveContainer" containerID="cc718e910abbc01dd2b183e42a8b5fe01af333504216d4bf35bd27c045e823d6" Sep 29 11:23:32 crc kubenswrapper[4752]: E0929 11:23:32.705592 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc718e910abbc01dd2b183e42a8b5fe01af333504216d4bf35bd27c045e823d6\": container with ID starting with cc718e910abbc01dd2b183e42a8b5fe01af333504216d4bf35bd27c045e823d6 not found: ID does not exist" containerID="cc718e910abbc01dd2b183e42a8b5fe01af333504216d4bf35bd27c045e823d6" Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.705644 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc718e910abbc01dd2b183e42a8b5fe01af333504216d4bf35bd27c045e823d6"} err="failed to get container status \"cc718e910abbc01dd2b183e42a8b5fe01af333504216d4bf35bd27c045e823d6\": rpc error: code = NotFound desc = could not find container \"cc718e910abbc01dd2b183e42a8b5fe01af333504216d4bf35bd27c045e823d6\": container with ID starting with cc718e910abbc01dd2b183e42a8b5fe01af333504216d4bf35bd27c045e823d6 not found: ID does not exist" Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.705680 4752 scope.go:117] "RemoveContainer" containerID="fb92b3a4a57cfca3386cc71bf3b3ca2a5193780ea4d688470a9858348cf980c0" Sep 29 11:23:32 crc kubenswrapper[4752]: E0929 
11:23:32.706248 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb92b3a4a57cfca3386cc71bf3b3ca2a5193780ea4d688470a9858348cf980c0\": container with ID starting with fb92b3a4a57cfca3386cc71bf3b3ca2a5193780ea4d688470a9858348cf980c0 not found: ID does not exist" containerID="fb92b3a4a57cfca3386cc71bf3b3ca2a5193780ea4d688470a9858348cf980c0" Sep 29 11:23:32 crc kubenswrapper[4752]: I0929 11:23:32.706277 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb92b3a4a57cfca3386cc71bf3b3ca2a5193780ea4d688470a9858348cf980c0"} err="failed to get container status \"fb92b3a4a57cfca3386cc71bf3b3ca2a5193780ea4d688470a9858348cf980c0\": rpc error: code = NotFound desc = could not find container \"fb92b3a4a57cfca3386cc71bf3b3ca2a5193780ea4d688470a9858348cf980c0\": container with ID starting with fb92b3a4a57cfca3386cc71bf3b3ca2a5193780ea4d688470a9858348cf980c0 not found: ID does not exist" Sep 29 11:23:34 crc kubenswrapper[4752]: I0929 11:23:34.044299 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2938c9c3-83a5-4b77-b910-5a0b03b59ad1" path="/var/lib/kubelet/pods/2938c9c3-83a5-4b77-b910-5a0b03b59ad1/volumes" Sep 29 11:23:46 crc kubenswrapper[4752]: I0929 11:23:46.031652 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:23:46 crc kubenswrapper[4752]: E0929 11:23:46.032672 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:23:59 crc kubenswrapper[4752]: I0929 11:23:59.030999 
4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:23:59 crc kubenswrapper[4752]: E0929 11:23:59.031759 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:24:10 crc kubenswrapper[4752]: I0929 11:24:10.036936 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:24:10 crc kubenswrapper[4752]: E0929 11:24:10.038269 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:24:24 crc kubenswrapper[4752]: I0929 11:24:24.030974 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:24:24 crc kubenswrapper[4752]: E0929 11:24:24.031748 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:24:36 crc kubenswrapper[4752]: I0929 
11:24:36.031102 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:24:36 crc kubenswrapper[4752]: E0929 11:24:36.032962 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:24:51 crc kubenswrapper[4752]: I0929 11:24:51.030773 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:24:51 crc kubenswrapper[4752]: E0929 11:24:51.031867 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:25:02 crc kubenswrapper[4752]: I0929 11:25:02.031525 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:25:02 crc kubenswrapper[4752]: E0929 11:25:02.032305 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:25:12 crc 
kubenswrapper[4752]: I0929 11:25:12.811675 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d"] Sep 29 11:25:12 crc kubenswrapper[4752]: I0929 11:25:12.821313 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-b6v7d"] Sep 29 11:25:12 crc kubenswrapper[4752]: I0929 11:25:12.875909 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher3ba4-account-delete-zpl66"] Sep 29 11:25:12 crc kubenswrapper[4752]: E0929 11:25:12.876266 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2938c9c3-83a5-4b77-b910-5a0b03b59ad1" containerName="registry-server" Sep 29 11:25:12 crc kubenswrapper[4752]: I0929 11:25:12.876284 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2938c9c3-83a5-4b77-b910-5a0b03b59ad1" containerName="registry-server" Sep 29 11:25:12 crc kubenswrapper[4752]: E0929 11:25:12.876296 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2938c9c3-83a5-4b77-b910-5a0b03b59ad1" containerName="extract-utilities" Sep 29 11:25:12 crc kubenswrapper[4752]: I0929 11:25:12.876304 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2938c9c3-83a5-4b77-b910-5a0b03b59ad1" containerName="extract-utilities" Sep 29 11:25:12 crc kubenswrapper[4752]: E0929 11:25:12.876318 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2938c9c3-83a5-4b77-b910-5a0b03b59ad1" containerName="extract-content" Sep 29 11:25:12 crc kubenswrapper[4752]: I0929 11:25:12.876324 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2938c9c3-83a5-4b77-b910-5a0b03b59ad1" containerName="extract-content" Sep 29 11:25:12 crc kubenswrapper[4752]: I0929 11:25:12.876462 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2938c9c3-83a5-4b77-b910-5a0b03b59ad1" containerName="registry-server" Sep 29 11:25:12 crc kubenswrapper[4752]: I0929 11:25:12.877044 4752 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher3ba4-account-delete-zpl66" Sep 29 11:25:12 crc kubenswrapper[4752]: I0929 11:25:12.890166 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher3ba4-account-delete-zpl66"] Sep 29 11:25:12 crc kubenswrapper[4752]: I0929 11:25:12.899093 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:25:12 crc kubenswrapper[4752]: I0929 11:25:12.899393 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="a22b23af-2fc7-46f3-9c73-87bc740dad1d" containerName="watcher-applier" containerID="cri-o://18eccdbe0dd6a0b5ffa48b63bee01e4af645e35adbb34d5064e809e647e8fe8b" gracePeriod=30 Sep 29 11:25:12 crc kubenswrapper[4752]: I0929 11:25:12.923787 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-rphvs"] Sep 29 11:25:12 crc kubenswrapper[4752]: I0929 11:25:12.925434 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7k44\" (UniqueName: \"kubernetes.io/projected/35a2f395-752e-45f2-8c82-6816af965761-kube-api-access-b7k44\") pod \"watcher3ba4-account-delete-zpl66\" (UID: \"35a2f395-752e-45f2-8c82-6816af965761\") " pod="watcher-kuttl-default/watcher3ba4-account-delete-zpl66" Sep 29 11:25:12 crc kubenswrapper[4752]: I0929 11:25:12.939381 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-rphvs"] Sep 29 11:25:12 crc kubenswrapper[4752]: I0929 11:25:12.989285 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher3ba4-account-delete-zpl66"] Sep 29 11:25:12 crc kubenswrapper[4752]: E0929 11:25:12.989907 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-b7k44], unattached volumes=[], failed to 
process volumes=[]: context canceled" pod="watcher-kuttl-default/watcher3ba4-account-delete-zpl66" podUID="35a2f395-752e-45f2-8c82-6816af965761" Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.003815 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-3ba4-account-create-gjw7n"] Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.010866 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-3ba4-account-create-gjw7n"] Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.024466 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.024856 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="9f99592e-51b1-44bd-bd79-d7d3d3bc7c26" containerName="watcher-kuttl-api-log" containerID="cri-o://f3635e71220d0d708499ef8ad13d80c71cea4e85679662528dfdfebd177287d0" gracePeriod=30 Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.024969 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="9f99592e-51b1-44bd-bd79-d7d3d3bc7c26" containerName="watcher-api" containerID="cri-o://2b16e28236bf1fce5ca0b48f7551fddbb674dc12077a05c1597a0697beb0068e" gracePeriod=30 Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.028918 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7k44\" (UniqueName: \"kubernetes.io/projected/35a2f395-752e-45f2-8c82-6816af965761-kube-api-access-b7k44\") pod \"watcher3ba4-account-delete-zpl66\" (UID: \"35a2f395-752e-45f2-8c82-6816af965761\") " pod="watcher-kuttl-default/watcher3ba4-account-delete-zpl66" Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.037539 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.037771 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="59b9b329-fe3a-4af8-8234-2d560d557569" containerName="watcher-decision-engine" containerID="cri-o://99294096ba745b02ccdde54944f3af21a500b75829775a82857c3603fb9f1cdb" gracePeriod=30 Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.059491 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7k44\" (UniqueName: \"kubernetes.io/projected/35a2f395-752e-45f2-8c82-6816af965761-kube-api-access-b7k44\") pod \"watcher3ba4-account-delete-zpl66\" (UID: \"35a2f395-752e-45f2-8c82-6816af965761\") " pod="watcher-kuttl-default/watcher3ba4-account-delete-zpl66" Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.243875 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-97jzl"] Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.245038 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-97jzl" Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.258444 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-97jzl"] Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.336821 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc7g6\" (UniqueName: \"kubernetes.io/projected/45988c51-6072-40ba-91cf-dce53d796c66-kube-api-access-xc7g6\") pod \"watcher-db-create-97jzl\" (UID: \"45988c51-6072-40ba-91cf-dce53d796c66\") " pod="watcher-kuttl-default/watcher-db-create-97jzl" Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.438442 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc7g6\" (UniqueName: \"kubernetes.io/projected/45988c51-6072-40ba-91cf-dce53d796c66-kube-api-access-xc7g6\") pod \"watcher-db-create-97jzl\" (UID: \"45988c51-6072-40ba-91cf-dce53d796c66\") " pod="watcher-kuttl-default/watcher-db-create-97jzl" Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.459835 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc7g6\" (UniqueName: \"kubernetes.io/projected/45988c51-6072-40ba-91cf-dce53d796c66-kube-api-access-xc7g6\") pod \"watcher-db-create-97jzl\" (UID: \"45988c51-6072-40ba-91cf-dce53d796c66\") " pod="watcher-kuttl-default/watcher-db-create-97jzl" Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.469921 4752 generic.go:334] "Generic (PLEG): container finished" podID="9f99592e-51b1-44bd-bd79-d7d3d3bc7c26" containerID="f3635e71220d0d708499ef8ad13d80c71cea4e85679662528dfdfebd177287d0" exitCode=143 Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.469984 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher3ba4-account-delete-zpl66" Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.470585 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26","Type":"ContainerDied","Data":"f3635e71220d0d708499ef8ad13d80c71cea4e85679662528dfdfebd177287d0"} Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.507254 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher3ba4-account-delete-zpl66" Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.559400 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-97jzl" Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.642442 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7k44\" (UniqueName: \"kubernetes.io/projected/35a2f395-752e-45f2-8c82-6816af965761-kube-api-access-b7k44\") pod \"35a2f395-752e-45f2-8c82-6816af965761\" (UID: \"35a2f395-752e-45f2-8c82-6816af965761\") " Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.649294 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35a2f395-752e-45f2-8c82-6816af965761-kube-api-access-b7k44" (OuterVolumeSpecName: "kube-api-access-b7k44") pod "35a2f395-752e-45f2-8c82-6816af965761" (UID: "35a2f395-752e-45f2-8c82-6816af965761"). InnerVolumeSpecName "kube-api-access-b7k44". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:25:13 crc kubenswrapper[4752]: I0929 11:25:13.747469 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7k44\" (UniqueName: \"kubernetes.io/projected/35a2f395-752e-45f2-8c82-6816af965761-kube-api-access-b7k44\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:13 crc kubenswrapper[4752]: E0929 11:25:13.796275 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="18eccdbe0dd6a0b5ffa48b63bee01e4af645e35adbb34d5064e809e647e8fe8b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:25:13 crc kubenswrapper[4752]: E0929 11:25:13.797484 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="18eccdbe0dd6a0b5ffa48b63bee01e4af645e35adbb34d5064e809e647e8fe8b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:25:13 crc kubenswrapper[4752]: E0929 11:25:13.799179 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="18eccdbe0dd6a0b5ffa48b63bee01e4af645e35adbb34d5064e809e647e8fe8b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:25:13 crc kubenswrapper[4752]: E0929 11:25:13.799220 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="a22b23af-2fc7-46f3-9c73-87bc740dad1d" containerName="watcher-applier" Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.053887 4752 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="4872b8c9-dd16-411f-a040-58971a7cf987" path="/var/lib/kubelet/pods/4872b8c9-dd16-411f-a040-58971a7cf987/volumes" Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.054795 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="761747e9-4df9-49ce-acdc-dac1809f07b7" path="/var/lib/kubelet/pods/761747e9-4df9-49ce-acdc-dac1809f07b7/volumes" Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.055213 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="879c3c04-5844-497c-81ae-c5218cdcd806" path="/var/lib/kubelet/pods/879c3c04-5844-497c-81ae-c5218cdcd806/volumes" Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.055750 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-97jzl"] Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.165958 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="9f99592e-51b1-44bd-bd79-d7d3d3bc7c26" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.169:9322/\": read tcp 10.217.0.2:49290->10.217.0.169:9322: read: connection reset by peer" Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.166076 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="9f99592e-51b1-44bd-bd79-d7d3d3bc7c26" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.169:9322/\": read tcp 10.217.0.2:49284->10.217.0.169:9322: read: connection reset by peer" Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.481281 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-97jzl" event={"ID":"45988c51-6072-40ba-91cf-dce53d796c66","Type":"ContainerStarted","Data":"460a2a199251eca257f5d76305a4ea39ad8b86f7cc868993b6371f1270ca6437"} Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.482137 4752 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-97jzl" event={"ID":"45988c51-6072-40ba-91cf-dce53d796c66","Type":"ContainerStarted","Data":"1af112391bdb3e9760227882520cf1f90af71ca7a148c2512b18c75b91b9ad0c"} Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.494280 4752 generic.go:334] "Generic (PLEG): container finished" podID="9f99592e-51b1-44bd-bd79-d7d3d3bc7c26" containerID="2b16e28236bf1fce5ca0b48f7551fddbb674dc12077a05c1597a0697beb0068e" exitCode=0 Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.494431 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher3ba4-account-delete-zpl66" Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.495397 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26","Type":"ContainerDied","Data":"2b16e28236bf1fce5ca0b48f7551fddbb674dc12077a05c1597a0697beb0068e"} Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.599645 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher3ba4-account-delete-zpl66"] Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.616760 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher3ba4-account-delete-zpl66"] Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.655445 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.763310 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-custom-prometheus-ca\") pod \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\" (UID: \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\") " Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.763418 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc6ws\" (UniqueName: \"kubernetes.io/projected/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-kube-api-access-vc6ws\") pod \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\" (UID: \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\") " Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.763445 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-combined-ca-bundle\") pod \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\" (UID: \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\") " Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.763496 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-logs\") pod \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\" (UID: \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\") " Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.763612 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-config-data\") pod \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\" (UID: \"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26\") " Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.767263 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-logs" (OuterVolumeSpecName: "logs") pod "9f99592e-51b1-44bd-bd79-d7d3d3bc7c26" (UID: "9f99592e-51b1-44bd-bd79-d7d3d3bc7c26"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.774035 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-kube-api-access-vc6ws" (OuterVolumeSpecName: "kube-api-access-vc6ws") pod "9f99592e-51b1-44bd-bd79-d7d3d3bc7c26" (UID: "9f99592e-51b1-44bd-bd79-d7d3d3bc7c26"). InnerVolumeSpecName "kube-api-access-vc6ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.861067 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "9f99592e-51b1-44bd-bd79-d7d3d3bc7c26" (UID: "9f99592e-51b1-44bd-bd79-d7d3d3bc7c26"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.866429 4752 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.866476 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc6ws\" (UniqueName: \"kubernetes.io/projected/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-kube-api-access-vc6ws\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.866493 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-logs\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.867088 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-config-data" (OuterVolumeSpecName: "config-data") pod "9f99592e-51b1-44bd-bd79-d7d3d3bc7c26" (UID: "9f99592e-51b1-44bd-bd79-d7d3d3bc7c26"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.877927 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f99592e-51b1-44bd-bd79-d7d3d3bc7c26" (UID: "9f99592e-51b1-44bd-bd79-d7d3d3bc7c26"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.968144 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:14 crc kubenswrapper[4752]: I0929 11:25:14.968204 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.390047 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m7dts"] Sep 29 11:25:15 crc kubenswrapper[4752]: E0929 11:25:15.390633 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f99592e-51b1-44bd-bd79-d7d3d3bc7c26" containerName="watcher-kuttl-api-log" Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.390643 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f99592e-51b1-44bd-bd79-d7d3d3bc7c26" containerName="watcher-kuttl-api-log" Sep 29 11:25:15 crc kubenswrapper[4752]: E0929 11:25:15.390658 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f99592e-51b1-44bd-bd79-d7d3d3bc7c26" containerName="watcher-api" Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.390664 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f99592e-51b1-44bd-bd79-d7d3d3bc7c26" containerName="watcher-api" Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.390835 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f99592e-51b1-44bd-bd79-d7d3d3bc7c26" containerName="watcher-api" Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.390848 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f99592e-51b1-44bd-bd79-d7d3d3bc7c26" containerName="watcher-kuttl-api-log" Sep 29 11:25:15 crc 
kubenswrapper[4752]: I0929 11:25:15.391991 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m7dts" Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.401067 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m7dts"] Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.475325 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xbgl\" (UniqueName: \"kubernetes.io/projected/1454f20a-1684-4223-b8d5-b171a26471f7-kube-api-access-2xbgl\") pod \"certified-operators-m7dts\" (UID: \"1454f20a-1684-4223-b8d5-b171a26471f7\") " pod="openshift-marketplace/certified-operators-m7dts" Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.475404 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1454f20a-1684-4223-b8d5-b171a26471f7-catalog-content\") pod \"certified-operators-m7dts\" (UID: \"1454f20a-1684-4223-b8d5-b171a26471f7\") " pod="openshift-marketplace/certified-operators-m7dts" Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.475673 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1454f20a-1684-4223-b8d5-b171a26471f7-utilities\") pod \"certified-operators-m7dts\" (UID: \"1454f20a-1684-4223-b8d5-b171a26471f7\") " pod="openshift-marketplace/certified-operators-m7dts" Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.503432 4752 generic.go:334] "Generic (PLEG): container finished" podID="45988c51-6072-40ba-91cf-dce53d796c66" containerID="460a2a199251eca257f5d76305a4ea39ad8b86f7cc868993b6371f1270ca6437" exitCode=0 Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.503516 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/watcher-db-create-97jzl" event={"ID":"45988c51-6072-40ba-91cf-dce53d796c66","Type":"ContainerDied","Data":"460a2a199251eca257f5d76305a4ea39ad8b86f7cc868993b6371f1270ca6437"} Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.506716 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9f99592e-51b1-44bd-bd79-d7d3d3bc7c26","Type":"ContainerDied","Data":"00e421de18d9c5da494915a2bc6fa1b4be38bdb38fb78418700a814db4933a18"} Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.506753 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.506772 4752 scope.go:117] "RemoveContainer" containerID="2b16e28236bf1fce5ca0b48f7551fddbb674dc12077a05c1597a0697beb0068e" Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.541934 4752 scope.go:117] "RemoveContainer" containerID="f3635e71220d0d708499ef8ad13d80c71cea4e85679662528dfdfebd177287d0" Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.544466 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.552233 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.577661 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1454f20a-1684-4223-b8d5-b171a26471f7-catalog-content\") pod \"certified-operators-m7dts\" (UID: \"1454f20a-1684-4223-b8d5-b171a26471f7\") " pod="openshift-marketplace/certified-operators-m7dts" Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.577764 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1454f20a-1684-4223-b8d5-b171a26471f7-utilities\") pod \"certified-operators-m7dts\" (UID: \"1454f20a-1684-4223-b8d5-b171a26471f7\") " pod="openshift-marketplace/certified-operators-m7dts" Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.577873 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xbgl\" (UniqueName: \"kubernetes.io/projected/1454f20a-1684-4223-b8d5-b171a26471f7-kube-api-access-2xbgl\") pod \"certified-operators-m7dts\" (UID: \"1454f20a-1684-4223-b8d5-b171a26471f7\") " pod="openshift-marketplace/certified-operators-m7dts" Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.578374 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1454f20a-1684-4223-b8d5-b171a26471f7-catalog-content\") pod \"certified-operators-m7dts\" (UID: \"1454f20a-1684-4223-b8d5-b171a26471f7\") " pod="openshift-marketplace/certified-operators-m7dts" Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.578608 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1454f20a-1684-4223-b8d5-b171a26471f7-utilities\") pod \"certified-operators-m7dts\" (UID: \"1454f20a-1684-4223-b8d5-b171a26471f7\") " pod="openshift-marketplace/certified-operators-m7dts" Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.611348 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xbgl\" (UniqueName: \"kubernetes.io/projected/1454f20a-1684-4223-b8d5-b171a26471f7-kube-api-access-2xbgl\") pod \"certified-operators-m7dts\" (UID: \"1454f20a-1684-4223-b8d5-b171a26471f7\") " pod="openshift-marketplace/certified-operators-m7dts" Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.710858 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m7dts" Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.920685 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-97jzl" Sep 29 11:25:15 crc kubenswrapper[4752]: I0929 11:25:15.992816 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc7g6\" (UniqueName: \"kubernetes.io/projected/45988c51-6072-40ba-91cf-dce53d796c66-kube-api-access-xc7g6\") pod \"45988c51-6072-40ba-91cf-dce53d796c66\" (UID: \"45988c51-6072-40ba-91cf-dce53d796c66\") " Sep 29 11:25:16 crc kubenswrapper[4752]: I0929 11:25:16.005382 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45988c51-6072-40ba-91cf-dce53d796c66-kube-api-access-xc7g6" (OuterVolumeSpecName: "kube-api-access-xc7g6") pod "45988c51-6072-40ba-91cf-dce53d796c66" (UID: "45988c51-6072-40ba-91cf-dce53d796c66"). InnerVolumeSpecName "kube-api-access-xc7g6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:25:16 crc kubenswrapper[4752]: I0929 11:25:16.036926 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:25:16 crc kubenswrapper[4752]: E0929 11:25:16.037282 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:25:16 crc kubenswrapper[4752]: I0929 11:25:16.053486 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35a2f395-752e-45f2-8c82-6816af965761" path="/var/lib/kubelet/pods/35a2f395-752e-45f2-8c82-6816af965761/volumes" Sep 29 11:25:16 crc kubenswrapper[4752]: I0929 11:25:16.053893 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f99592e-51b1-44bd-bd79-d7d3d3bc7c26" path="/var/lib/kubelet/pods/9f99592e-51b1-44bd-bd79-d7d3d3bc7c26/volumes" Sep 29 11:25:16 crc kubenswrapper[4752]: I0929 11:25:16.094969 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc7g6\" (UniqueName: \"kubernetes.io/projected/45988c51-6072-40ba-91cf-dce53d796c66-kube-api-access-xc7g6\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:16 crc kubenswrapper[4752]: I0929 11:25:16.258379 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m7dts"] Sep 29 11:25:16 crc kubenswrapper[4752]: W0929 11:25:16.261737 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1454f20a_1684_4223_b8d5_b171a26471f7.slice/crio-ea49e3d73270f3d08fa4ccafb11e46460cb548ba50cc69a35af2de4255d80de4 
WatchSource:0}: Error finding container ea49e3d73270f3d08fa4ccafb11e46460cb548ba50cc69a35af2de4255d80de4: Status 404 returned error can't find the container with id ea49e3d73270f3d08fa4ccafb11e46460cb548ba50cc69a35af2de4255d80de4 Sep 29 11:25:16 crc kubenswrapper[4752]: I0929 11:25:16.365870 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:25:16 crc kubenswrapper[4752]: I0929 11:25:16.366196 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" containerName="ceilometer-central-agent" containerID="cri-o://29a90c8915ed05357a7831ca4c06378144417f842bc1fba44504af4d3e816a39" gracePeriod=30 Sep 29 11:25:16 crc kubenswrapper[4752]: I0929 11:25:16.366346 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" containerName="proxy-httpd" containerID="cri-o://52b37aab03b1cef60bb61ff8aa4e7017a35aa8d8a53cec2a68f08f89b92707be" gracePeriod=30 Sep 29 11:25:16 crc kubenswrapper[4752]: I0929 11:25:16.366403 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" containerName="sg-core" containerID="cri-o://dca2a889b7881e2450483c36abd8faf6f3b25b0deb45ae32407782b74f596d58" gracePeriod=30 Sep 29 11:25:16 crc kubenswrapper[4752]: I0929 11:25:16.366452 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" containerName="ceilometer-notification-agent" containerID="cri-o://76f6435988eec008338092b7cbed1aa1442ca8a8c1350b4b32d903bd7b983ff6" gracePeriod=30 Sep 29 11:25:16 crc kubenswrapper[4752]: I0929 11:25:16.540613 4752 generic.go:334] "Generic (PLEG): container finished" 
podID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" containerID="dca2a889b7881e2450483c36abd8faf6f3b25b0deb45ae32407782b74f596d58" exitCode=2 Sep 29 11:25:16 crc kubenswrapper[4752]: I0929 11:25:16.546422 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29","Type":"ContainerDied","Data":"dca2a889b7881e2450483c36abd8faf6f3b25b0deb45ae32407782b74f596d58"} Sep 29 11:25:16 crc kubenswrapper[4752]: I0929 11:25:16.552578 4752 generic.go:334] "Generic (PLEG): container finished" podID="1454f20a-1684-4223-b8d5-b171a26471f7" containerID="a73faccd294b880b7fe63ab59e3c771c853d10de90e82a084c7e1b2986691e63" exitCode=0 Sep 29 11:25:16 crc kubenswrapper[4752]: I0929 11:25:16.553455 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7dts" event={"ID":"1454f20a-1684-4223-b8d5-b171a26471f7","Type":"ContainerDied","Data":"a73faccd294b880b7fe63ab59e3c771c853d10de90e82a084c7e1b2986691e63"} Sep 29 11:25:16 crc kubenswrapper[4752]: I0929 11:25:16.553478 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7dts" event={"ID":"1454f20a-1684-4223-b8d5-b171a26471f7","Type":"ContainerStarted","Data":"ea49e3d73270f3d08fa4ccafb11e46460cb548ba50cc69a35af2de4255d80de4"} Sep 29 11:25:16 crc kubenswrapper[4752]: I0929 11:25:16.556518 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-97jzl" event={"ID":"45988c51-6072-40ba-91cf-dce53d796c66","Type":"ContainerDied","Data":"1af112391bdb3e9760227882520cf1f90af71ca7a148c2512b18c75b91b9ad0c"} Sep 29 11:25:16 crc kubenswrapper[4752]: I0929 11:25:16.556550 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1af112391bdb3e9760227882520cf1f90af71ca7a148c2512b18c75b91b9ad0c" Sep 29 11:25:16 crc kubenswrapper[4752]: I0929 11:25:16.556611 4752 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-97jzl" Sep 29 11:25:17 crc kubenswrapper[4752]: I0929 11:25:17.567886 4752 generic.go:334] "Generic (PLEG): container finished" podID="a22b23af-2fc7-46f3-9c73-87bc740dad1d" containerID="18eccdbe0dd6a0b5ffa48b63bee01e4af645e35adbb34d5064e809e647e8fe8b" exitCode=0 Sep 29 11:25:17 crc kubenswrapper[4752]: I0929 11:25:17.567961 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"a22b23af-2fc7-46f3-9c73-87bc740dad1d","Type":"ContainerDied","Data":"18eccdbe0dd6a0b5ffa48b63bee01e4af645e35adbb34d5064e809e647e8fe8b"} Sep 29 11:25:17 crc kubenswrapper[4752]: I0929 11:25:17.570692 4752 generic.go:334] "Generic (PLEG): container finished" podID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" containerID="52b37aab03b1cef60bb61ff8aa4e7017a35aa8d8a53cec2a68f08f89b92707be" exitCode=0 Sep 29 11:25:17 crc kubenswrapper[4752]: I0929 11:25:17.570721 4752 generic.go:334] "Generic (PLEG): container finished" podID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" containerID="29a90c8915ed05357a7831ca4c06378144417f842bc1fba44504af4d3e816a39" exitCode=0 Sep 29 11:25:17 crc kubenswrapper[4752]: I0929 11:25:17.570741 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29","Type":"ContainerDied","Data":"52b37aab03b1cef60bb61ff8aa4e7017a35aa8d8a53cec2a68f08f89b92707be"} Sep 29 11:25:17 crc kubenswrapper[4752]: I0929 11:25:17.570767 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29","Type":"ContainerDied","Data":"29a90c8915ed05357a7831ca4c06378144417f842bc1fba44504af4d3e816a39"} Sep 29 11:25:17 crc kubenswrapper[4752]: I0929 11:25:17.884587 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:25:17 crc kubenswrapper[4752]: I0929 11:25:17.937337 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99tw4\" (UniqueName: \"kubernetes.io/projected/a22b23af-2fc7-46f3-9c73-87bc740dad1d-kube-api-access-99tw4\") pod \"a22b23af-2fc7-46f3-9c73-87bc740dad1d\" (UID: \"a22b23af-2fc7-46f3-9c73-87bc740dad1d\") " Sep 29 11:25:17 crc kubenswrapper[4752]: I0929 11:25:17.937429 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22b23af-2fc7-46f3-9c73-87bc740dad1d-combined-ca-bundle\") pod \"a22b23af-2fc7-46f3-9c73-87bc740dad1d\" (UID: \"a22b23af-2fc7-46f3-9c73-87bc740dad1d\") " Sep 29 11:25:17 crc kubenswrapper[4752]: I0929 11:25:17.937494 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a22b23af-2fc7-46f3-9c73-87bc740dad1d-logs\") pod \"a22b23af-2fc7-46f3-9c73-87bc740dad1d\" (UID: \"a22b23af-2fc7-46f3-9c73-87bc740dad1d\") " Sep 29 11:25:17 crc kubenswrapper[4752]: I0929 11:25:17.937624 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a22b23af-2fc7-46f3-9c73-87bc740dad1d-config-data\") pod \"a22b23af-2fc7-46f3-9c73-87bc740dad1d\" (UID: \"a22b23af-2fc7-46f3-9c73-87bc740dad1d\") " Sep 29 11:25:17 crc kubenswrapper[4752]: I0929 11:25:17.938083 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a22b23af-2fc7-46f3-9c73-87bc740dad1d-logs" (OuterVolumeSpecName: "logs") pod "a22b23af-2fc7-46f3-9c73-87bc740dad1d" (UID: "a22b23af-2fc7-46f3-9c73-87bc740dad1d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:25:17 crc kubenswrapper[4752]: I0929 11:25:17.946032 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a22b23af-2fc7-46f3-9c73-87bc740dad1d-kube-api-access-99tw4" (OuterVolumeSpecName: "kube-api-access-99tw4") pod "a22b23af-2fc7-46f3-9c73-87bc740dad1d" (UID: "a22b23af-2fc7-46f3-9c73-87bc740dad1d"). InnerVolumeSpecName "kube-api-access-99tw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:25:17 crc kubenswrapper[4752]: I0929 11:25:17.969320 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a22b23af-2fc7-46f3-9c73-87bc740dad1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a22b23af-2fc7-46f3-9c73-87bc740dad1d" (UID: "a22b23af-2fc7-46f3-9c73-87bc740dad1d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:25:17 crc kubenswrapper[4752]: I0929 11:25:17.989644 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a22b23af-2fc7-46f3-9c73-87bc740dad1d-config-data" (OuterVolumeSpecName: "config-data") pod "a22b23af-2fc7-46f3-9c73-87bc740dad1d" (UID: "a22b23af-2fc7-46f3-9c73-87bc740dad1d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.040861 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99tw4\" (UniqueName: \"kubernetes.io/projected/a22b23af-2fc7-46f3-9c73-87bc740dad1d-kube-api-access-99tw4\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.041209 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22b23af-2fc7-46f3-9c73-87bc740dad1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.041222 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a22b23af-2fc7-46f3-9c73-87bc740dad1d-logs\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.041234 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a22b23af-2fc7-46f3-9c73-87bc740dad1d-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.582611 4752 generic.go:334] "Generic (PLEG): container finished" podID="59b9b329-fe3a-4af8-8234-2d560d557569" containerID="99294096ba745b02ccdde54944f3af21a500b75829775a82857c3603fb9f1cdb" exitCode=0 Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.582665 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"59b9b329-fe3a-4af8-8234-2d560d557569","Type":"ContainerDied","Data":"99294096ba745b02ccdde54944f3af21a500b75829775a82857c3603fb9f1cdb"} Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.584401 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" 
event={"ID":"a22b23af-2fc7-46f3-9c73-87bc740dad1d","Type":"ContainerDied","Data":"6cff88a35a4dfcb83c2488fefc4461c35ec44fc1325e4133ad265c2e53254ffc"} Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.584433 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.584474 4752 scope.go:117] "RemoveContainer" containerID="18eccdbe0dd6a0b5ffa48b63bee01e4af645e35adbb34d5064e809e647e8fe8b" Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.587620 4752 generic.go:334] "Generic (PLEG): container finished" podID="1454f20a-1684-4223-b8d5-b171a26471f7" containerID="7f02411cc8e40123ef978f225f832fe0ee4c67580564c7468028acfa6412a0a1" exitCode=0 Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.587658 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7dts" event={"ID":"1454f20a-1684-4223-b8d5-b171a26471f7","Type":"ContainerDied","Data":"7f02411cc8e40123ef978f225f832fe0ee4c67580564c7468028acfa6412a0a1"} Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.636976 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.657462 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.783464 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.854762 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6tlk\" (UniqueName: \"kubernetes.io/projected/59b9b329-fe3a-4af8-8234-2d560d557569-kube-api-access-k6tlk\") pod \"59b9b329-fe3a-4af8-8234-2d560d557569\" (UID: \"59b9b329-fe3a-4af8-8234-2d560d557569\") " Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.854979 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59b9b329-fe3a-4af8-8234-2d560d557569-logs\") pod \"59b9b329-fe3a-4af8-8234-2d560d557569\" (UID: \"59b9b329-fe3a-4af8-8234-2d560d557569\") " Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.855068 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59b9b329-fe3a-4af8-8234-2d560d557569-config-data\") pod \"59b9b329-fe3a-4af8-8234-2d560d557569\" (UID: \"59b9b329-fe3a-4af8-8234-2d560d557569\") " Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.855096 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b9b329-fe3a-4af8-8234-2d560d557569-combined-ca-bundle\") pod \"59b9b329-fe3a-4af8-8234-2d560d557569\" (UID: \"59b9b329-fe3a-4af8-8234-2d560d557569\") " Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.855129 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/59b9b329-fe3a-4af8-8234-2d560d557569-custom-prometheus-ca\") pod \"59b9b329-fe3a-4af8-8234-2d560d557569\" (UID: \"59b9b329-fe3a-4af8-8234-2d560d557569\") " Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.855451 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/59b9b329-fe3a-4af8-8234-2d560d557569-logs" (OuterVolumeSpecName: "logs") pod "59b9b329-fe3a-4af8-8234-2d560d557569" (UID: "59b9b329-fe3a-4af8-8234-2d560d557569"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.859865 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59b9b329-fe3a-4af8-8234-2d560d557569-kube-api-access-k6tlk" (OuterVolumeSpecName: "kube-api-access-k6tlk") pod "59b9b329-fe3a-4af8-8234-2d560d557569" (UID: "59b9b329-fe3a-4af8-8234-2d560d557569"). InnerVolumeSpecName "kube-api-access-k6tlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.877108 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b9b329-fe3a-4af8-8234-2d560d557569-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59b9b329-fe3a-4af8-8234-2d560d557569" (UID: "59b9b329-fe3a-4af8-8234-2d560d557569"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.888720 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b9b329-fe3a-4af8-8234-2d560d557569-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "59b9b329-fe3a-4af8-8234-2d560d557569" (UID: "59b9b329-fe3a-4af8-8234-2d560d557569"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.895667 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b9b329-fe3a-4af8-8234-2d560d557569-config-data" (OuterVolumeSpecName: "config-data") pod "59b9b329-fe3a-4af8-8234-2d560d557569" (UID: "59b9b329-fe3a-4af8-8234-2d560d557569"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.956788 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59b9b329-fe3a-4af8-8234-2d560d557569-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.956859 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b9b329-fe3a-4af8-8234-2d560d557569-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.956873 4752 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/59b9b329-fe3a-4af8-8234-2d560d557569-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.956883 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6tlk\" (UniqueName: \"kubernetes.io/projected/59b9b329-fe3a-4af8-8234-2d560d557569-kube-api-access-k6tlk\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:18 crc kubenswrapper[4752]: I0929 11:25:18.956893 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59b9b329-fe3a-4af8-8234-2d560d557569-logs\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.463962 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.565167 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-run-httpd\") pod \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.565255 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-config-data\") pod \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.565365 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq2wl\" (UniqueName: \"kubernetes.io/projected/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-kube-api-access-qq2wl\") pod \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.565392 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-log-httpd\") pod \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.565457 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-scripts\") pod \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.565481 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-sg-core-conf-yaml\") pod \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.565515 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-ceilometer-tls-certs\") pod \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.565537 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-combined-ca-bundle\") pod \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\" (UID: \"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29\") " Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.565650 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" (UID: "64ab5c9b-f499-4e45-a9a5-82d2a44d7a29"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.565926 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.565990 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" (UID: "64ab5c9b-f499-4e45-a9a5-82d2a44d7a29"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.569493 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-scripts" (OuterVolumeSpecName: "scripts") pod "64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" (UID: "64ab5c9b-f499-4e45-a9a5-82d2a44d7a29"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.571945 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-kube-api-access-qq2wl" (OuterVolumeSpecName: "kube-api-access-qq2wl") pod "64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" (UID: "64ab5c9b-f499-4e45-a9a5-82d2a44d7a29"). InnerVolumeSpecName "kube-api-access-qq2wl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.590618 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" (UID: "64ab5c9b-f499-4e45-a9a5-82d2a44d7a29"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.599281 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7dts" event={"ID":"1454f20a-1684-4223-b8d5-b171a26471f7","Type":"ContainerStarted","Data":"8d03988aed9a075fcab2a1f6e02d54f5e160f734a38f4cdba18d027be88f385c"} Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.603566 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"59b9b329-fe3a-4af8-8234-2d560d557569","Type":"ContainerDied","Data":"68430d5dd8dc79065edf20cb63d0ccdffccfd570c7f7bc7a6f3e21e0e42b2bee"} Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.603836 4752 scope.go:117] "RemoveContainer" containerID="99294096ba745b02ccdde54944f3af21a500b75829775a82857c3603fb9f1cdb" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.604298 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.611269 4752 generic.go:334] "Generic (PLEG): container finished" podID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" containerID="76f6435988eec008338092b7cbed1aa1442ca8a8c1350b4b32d903bd7b983ff6" exitCode=0 Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.611391 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29","Type":"ContainerDied","Data":"76f6435988eec008338092b7cbed1aa1442ca8a8c1350b4b32d903bd7b983ff6"} Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.611417 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"64ab5c9b-f499-4e45-a9a5-82d2a44d7a29","Type":"ContainerDied","Data":"fc0a6467d5868225cab54c24e69a1e1cada95adfcac643d30a0b61c326c4b241"} Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.611667 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.623213 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m7dts" podStartSLOduration=1.952861519 podStartE2EDuration="4.623177379s" podCreationTimestamp="2025-09-29 11:25:15 +0000 UTC" firstStartedPulling="2025-09-29 11:25:16.554541441 +0000 UTC m=+2457.343683108" lastFinishedPulling="2025-09-29 11:25:19.224857291 +0000 UTC m=+2460.013998968" observedRunningTime="2025-09-29 11:25:19.618456246 +0000 UTC m=+2460.407597913" watchObservedRunningTime="2025-09-29 11:25:19.623177379 +0000 UTC m=+2460.412319056" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.640450 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" (UID: "64ab5c9b-f499-4e45-a9a5-82d2a44d7a29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.651087 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" (UID: "64ab5c9b-f499-4e45-a9a5-82d2a44d7a29"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.651410 4752 scope.go:117] "RemoveContainer" containerID="52b37aab03b1cef60bb61ff8aa4e7017a35aa8d8a53cec2a68f08f89b92707be" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.657398 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.663872 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-config-data" (OuterVolumeSpecName: "config-data") pod "64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" (UID: "64ab5c9b-f499-4e45-a9a5-82d2a44d7a29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.664643 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.668092 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq2wl\" (UniqueName: \"kubernetes.io/projected/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-kube-api-access-qq2wl\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.668124 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.668135 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.668144 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.668154 4752 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.668163 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.668170 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.671077 4752 scope.go:117] "RemoveContainer" containerID="dca2a889b7881e2450483c36abd8faf6f3b25b0deb45ae32407782b74f596d58" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.685466 4752 scope.go:117] "RemoveContainer" containerID="76f6435988eec008338092b7cbed1aa1442ca8a8c1350b4b32d903bd7b983ff6" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.704322 4752 scope.go:117] "RemoveContainer" containerID="29a90c8915ed05357a7831ca4c06378144417f842bc1fba44504af4d3e816a39" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.722449 4752 scope.go:117] "RemoveContainer" containerID="52b37aab03b1cef60bb61ff8aa4e7017a35aa8d8a53cec2a68f08f89b92707be" Sep 29 11:25:19 crc kubenswrapper[4752]: E0929 11:25:19.722948 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52b37aab03b1cef60bb61ff8aa4e7017a35aa8d8a53cec2a68f08f89b92707be\": container with ID starting with 52b37aab03b1cef60bb61ff8aa4e7017a35aa8d8a53cec2a68f08f89b92707be not 
found: ID does not exist" containerID="52b37aab03b1cef60bb61ff8aa4e7017a35aa8d8a53cec2a68f08f89b92707be" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.722990 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b37aab03b1cef60bb61ff8aa4e7017a35aa8d8a53cec2a68f08f89b92707be"} err="failed to get container status \"52b37aab03b1cef60bb61ff8aa4e7017a35aa8d8a53cec2a68f08f89b92707be\": rpc error: code = NotFound desc = could not find container \"52b37aab03b1cef60bb61ff8aa4e7017a35aa8d8a53cec2a68f08f89b92707be\": container with ID starting with 52b37aab03b1cef60bb61ff8aa4e7017a35aa8d8a53cec2a68f08f89b92707be not found: ID does not exist" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.723015 4752 scope.go:117] "RemoveContainer" containerID="dca2a889b7881e2450483c36abd8faf6f3b25b0deb45ae32407782b74f596d58" Sep 29 11:25:19 crc kubenswrapper[4752]: E0929 11:25:19.723342 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dca2a889b7881e2450483c36abd8faf6f3b25b0deb45ae32407782b74f596d58\": container with ID starting with dca2a889b7881e2450483c36abd8faf6f3b25b0deb45ae32407782b74f596d58 not found: ID does not exist" containerID="dca2a889b7881e2450483c36abd8faf6f3b25b0deb45ae32407782b74f596d58" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.723371 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dca2a889b7881e2450483c36abd8faf6f3b25b0deb45ae32407782b74f596d58"} err="failed to get container status \"dca2a889b7881e2450483c36abd8faf6f3b25b0deb45ae32407782b74f596d58\": rpc error: code = NotFound desc = could not find container \"dca2a889b7881e2450483c36abd8faf6f3b25b0deb45ae32407782b74f596d58\": container with ID starting with dca2a889b7881e2450483c36abd8faf6f3b25b0deb45ae32407782b74f596d58 not found: ID does not exist" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.723390 
4752 scope.go:117] "RemoveContainer" containerID="76f6435988eec008338092b7cbed1aa1442ca8a8c1350b4b32d903bd7b983ff6" Sep 29 11:25:19 crc kubenswrapper[4752]: E0929 11:25:19.723724 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76f6435988eec008338092b7cbed1aa1442ca8a8c1350b4b32d903bd7b983ff6\": container with ID starting with 76f6435988eec008338092b7cbed1aa1442ca8a8c1350b4b32d903bd7b983ff6 not found: ID does not exist" containerID="76f6435988eec008338092b7cbed1aa1442ca8a8c1350b4b32d903bd7b983ff6" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.723747 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76f6435988eec008338092b7cbed1aa1442ca8a8c1350b4b32d903bd7b983ff6"} err="failed to get container status \"76f6435988eec008338092b7cbed1aa1442ca8a8c1350b4b32d903bd7b983ff6\": rpc error: code = NotFound desc = could not find container \"76f6435988eec008338092b7cbed1aa1442ca8a8c1350b4b32d903bd7b983ff6\": container with ID starting with 76f6435988eec008338092b7cbed1aa1442ca8a8c1350b4b32d903bd7b983ff6 not found: ID does not exist" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.723779 4752 scope.go:117] "RemoveContainer" containerID="29a90c8915ed05357a7831ca4c06378144417f842bc1fba44504af4d3e816a39" Sep 29 11:25:19 crc kubenswrapper[4752]: E0929 11:25:19.724200 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29a90c8915ed05357a7831ca4c06378144417f842bc1fba44504af4d3e816a39\": container with ID starting with 29a90c8915ed05357a7831ca4c06378144417f842bc1fba44504af4d3e816a39 not found: ID does not exist" containerID="29a90c8915ed05357a7831ca4c06378144417f842bc1fba44504af4d3e816a39" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.724242 4752 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"29a90c8915ed05357a7831ca4c06378144417f842bc1fba44504af4d3e816a39"} err="failed to get container status \"29a90c8915ed05357a7831ca4c06378144417f842bc1fba44504af4d3e816a39\": rpc error: code = NotFound desc = could not find container \"29a90c8915ed05357a7831ca4c06378144417f842bc1fba44504af4d3e816a39\": container with ID starting with 29a90c8915ed05357a7831ca4c06378144417f842bc1fba44504af4d3e816a39 not found: ID does not exist" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.947738 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.979763 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.999378 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:25:19 crc kubenswrapper[4752]: E0929 11:25:19.999835 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" containerName="proxy-httpd" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.999859 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" containerName="proxy-httpd" Sep 29 11:25:19 crc kubenswrapper[4752]: E0929 11:25:19.999883 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45988c51-6072-40ba-91cf-dce53d796c66" containerName="mariadb-database-create" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.999892 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="45988c51-6072-40ba-91cf-dce53d796c66" containerName="mariadb-database-create" Sep 29 11:25:19 crc kubenswrapper[4752]: E0929 11:25:19.999914 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a22b23af-2fc7-46f3-9c73-87bc740dad1d" containerName="watcher-applier" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 
11:25:19.999923 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22b23af-2fc7-46f3-9c73-87bc740dad1d" containerName="watcher-applier" Sep 29 11:25:19 crc kubenswrapper[4752]: E0929 11:25:19.999937 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b9b329-fe3a-4af8-8234-2d560d557569" containerName="watcher-decision-engine" Sep 29 11:25:19 crc kubenswrapper[4752]: I0929 11:25:19.999944 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b9b329-fe3a-4af8-8234-2d560d557569" containerName="watcher-decision-engine" Sep 29 11:25:20 crc kubenswrapper[4752]: E0929 11:25:19.999957 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" containerName="ceilometer-notification-agent" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:19.999966 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" containerName="ceilometer-notification-agent" Sep 29 11:25:20 crc kubenswrapper[4752]: E0929 11:25:19.999977 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" containerName="ceilometer-central-agent" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:19.999985 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" containerName="ceilometer-central-agent" Sep 29 11:25:20 crc kubenswrapper[4752]: E0929 11:25:19.999995 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" containerName="sg-core" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.000002 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" containerName="sg-core" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.000226 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" containerName="ceilometer-central-agent" Sep 29 
11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.000240 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="59b9b329-fe3a-4af8-8234-2d560d557569" containerName="watcher-decision-engine" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.000253 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="45988c51-6072-40ba-91cf-dce53d796c66" containerName="mariadb-database-create" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.000266 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" containerName="proxy-httpd" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.000277 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a22b23af-2fc7-46f3-9c73-87bc740dad1d" containerName="watcher-applier" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.000288 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" containerName="sg-core" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.000308 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" containerName="ceilometer-notification-agent" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.002401 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.007505 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.008475 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.008748 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.010404 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.066943 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59b9b329-fe3a-4af8-8234-2d560d557569" path="/var/lib/kubelet/pods/59b9b329-fe3a-4af8-8234-2d560d557569/volumes" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.069128 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64ab5c9b-f499-4e45-a9a5-82d2a44d7a29" path="/var/lib/kubelet/pods/64ab5c9b-f499-4e45-a9a5-82d2a44d7a29/volumes" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.070138 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a22b23af-2fc7-46f3-9c73-87bc740dad1d" path="/var/lib/kubelet/pods/a22b23af-2fc7-46f3-9c73-87bc740dad1d/volumes" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.076671 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.076873 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.076957 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-config-data\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.077055 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3da3393-1343-4c89-93ae-0d0c4a42da37-log-httpd\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.077124 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5khdl\" (UniqueName: \"kubernetes.io/projected/c3da3393-1343-4c89-93ae-0d0c4a42da37-kube-api-access-5khdl\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.077255 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-scripts\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.077330 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.077410 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3da3393-1343-4c89-93ae-0d0c4a42da37-run-httpd\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.178812 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.178866 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.178897 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-config-data\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.178930 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3da3393-1343-4c89-93ae-0d0c4a42da37-log-httpd\") pod \"ceilometer-0\" (UID: 
\"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.178950 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5khdl\" (UniqueName: \"kubernetes.io/projected/c3da3393-1343-4c89-93ae-0d0c4a42da37-kube-api-access-5khdl\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.178992 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-scripts\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.179008 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.179026 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3da3393-1343-4c89-93ae-0d0c4a42da37-run-httpd\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.179371 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3da3393-1343-4c89-93ae-0d0c4a42da37-run-httpd\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.180098 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3da3393-1343-4c89-93ae-0d0c4a42da37-log-httpd\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.184399 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.186739 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-config-data\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.187273 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-scripts\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.203293 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.208117 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5khdl\" (UniqueName: \"kubernetes.io/projected/c3da3393-1343-4c89-93ae-0d0c4a42da37-kube-api-access-5khdl\") pod \"ceilometer-0\" 
(UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.208567 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.348568 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:20 crc kubenswrapper[4752]: W0929 11:25:20.817796 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3da3393_1343_4c89_93ae_0d0c4a42da37.slice/crio-b95acc5a33785c9978d14aff85999c6688ec62a07e3e9206aaf0ab1393c784c4 WatchSource:0}: Error finding container b95acc5a33785c9978d14aff85999c6688ec62a07e3e9206aaf0ab1393c784c4: Status 404 returned error can't find the container with id b95acc5a33785c9978d14aff85999c6688ec62a07e3e9206aaf0ab1393c784c4 Sep 29 11:25:20 crc kubenswrapper[4752]: I0929 11:25:20.821685 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:25:21 crc kubenswrapper[4752]: I0929 11:25:21.630513 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c3da3393-1343-4c89-93ae-0d0c4a42da37","Type":"ContainerStarted","Data":"b95acc5a33785c9978d14aff85999c6688ec62a07e3e9206aaf0ab1393c784c4"} Sep 29 11:25:23 crc kubenswrapper[4752]: I0929 11:25:22.639020 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c3da3393-1343-4c89-93ae-0d0c4a42da37","Type":"ContainerStarted","Data":"b0c61026715200c4271cbed347d293dd1bc92eb08e9c1fe11c13d7b9693ac03c"} Sep 29 11:25:23 crc 
kubenswrapper[4752]: I0929 11:25:22.639319 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c3da3393-1343-4c89-93ae-0d0c4a42da37","Type":"ContainerStarted","Data":"d7fd9138103487aa206ea241d49b3f46e8c2c4039494512f333a436957e63a97"} Sep 29 11:25:23 crc kubenswrapper[4752]: I0929 11:25:23.649065 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c3da3393-1343-4c89-93ae-0d0c4a42da37","Type":"ContainerStarted","Data":"f4832676b8519faaa5f3a7e45280324c615b2a20a859fe3589cf4e298f3b648c"} Sep 29 11:25:25 crc kubenswrapper[4752]: I0929 11:25:25.665266 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c3da3393-1343-4c89-93ae-0d0c4a42da37","Type":"ContainerStarted","Data":"679190bed769d21c559d53903066087cd67ad43acf856a932c66073c0f268273"} Sep 29 11:25:25 crc kubenswrapper[4752]: I0929 11:25:25.666968 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:25:25 crc kubenswrapper[4752]: I0929 11:25:25.711755 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m7dts" Sep 29 11:25:25 crc kubenswrapper[4752]: I0929 11:25:25.714479 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m7dts" Sep 29 11:25:25 crc kubenswrapper[4752]: I0929 11:25:25.718367 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.759534874 podStartE2EDuration="6.718346219s" podCreationTimestamp="2025-09-29 11:25:19 +0000 UTC" firstStartedPulling="2025-09-29 11:25:20.820138376 +0000 UTC m=+2461.609280043" lastFinishedPulling="2025-09-29 11:25:24.778949711 +0000 UTC m=+2465.568091388" observedRunningTime="2025-09-29 11:25:25.71494667 +0000 UTC 
m=+2466.504088377" watchObservedRunningTime="2025-09-29 11:25:25.718346219 +0000 UTC m=+2466.507487886" Sep 29 11:25:25 crc kubenswrapper[4752]: I0929 11:25:25.769459 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m7dts" Sep 29 11:25:26 crc kubenswrapper[4752]: I0929 11:25:26.735927 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m7dts" Sep 29 11:25:28 crc kubenswrapper[4752]: I0929 11:25:28.002602 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-60ae-account-create-5djbr"] Sep 29 11:25:28 crc kubenswrapper[4752]: I0929 11:25:28.004241 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-60ae-account-create-5djbr" Sep 29 11:25:28 crc kubenswrapper[4752]: I0929 11:25:28.006121 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Sep 29 11:25:28 crc kubenswrapper[4752]: I0929 11:25:28.014630 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-60ae-account-create-5djbr"] Sep 29 11:25:28 crc kubenswrapper[4752]: I0929 11:25:28.032407 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:25:28 crc kubenswrapper[4752]: I0929 11:25:28.117347 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92lfp\" (UniqueName: \"kubernetes.io/projected/97c3c1de-9b22-442b-96b0-1df0d483fa12-kube-api-access-92lfp\") pod \"watcher-60ae-account-create-5djbr\" (UID: \"97c3c1de-9b22-442b-96b0-1df0d483fa12\") " pod="watcher-kuttl-default/watcher-60ae-account-create-5djbr" Sep 29 11:25:28 crc kubenswrapper[4752]: I0929 11:25:28.219630 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-92lfp\" (UniqueName: \"kubernetes.io/projected/97c3c1de-9b22-442b-96b0-1df0d483fa12-kube-api-access-92lfp\") pod \"watcher-60ae-account-create-5djbr\" (UID: \"97c3c1de-9b22-442b-96b0-1df0d483fa12\") " pod="watcher-kuttl-default/watcher-60ae-account-create-5djbr" Sep 29 11:25:28 crc kubenswrapper[4752]: I0929 11:25:28.245656 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92lfp\" (UniqueName: \"kubernetes.io/projected/97c3c1de-9b22-442b-96b0-1df0d483fa12-kube-api-access-92lfp\") pod \"watcher-60ae-account-create-5djbr\" (UID: \"97c3c1de-9b22-442b-96b0-1df0d483fa12\") " pod="watcher-kuttl-default/watcher-60ae-account-create-5djbr" Sep 29 11:25:28 crc kubenswrapper[4752]: I0929 11:25:28.324327 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-60ae-account-create-5djbr" Sep 29 11:25:28 crc kubenswrapper[4752]: I0929 11:25:28.700321 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerStarted","Data":"62794379c57b608ef54dfd72ad21536541e2a354845aabac9be8f3dde3444511"} Sep 29 11:25:28 crc kubenswrapper[4752]: W0929 11:25:28.795123 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97c3c1de_9b22_442b_96b0_1df0d483fa12.slice/crio-444dd6e68ad37a8b3910f49456c98e2a846d93bef7914b6ae8070a70115c46cf WatchSource:0}: Error finding container 444dd6e68ad37a8b3910f49456c98e2a846d93bef7914b6ae8070a70115c46cf: Status 404 returned error can't find the container with id 444dd6e68ad37a8b3910f49456c98e2a846d93bef7914b6ae8070a70115c46cf Sep 29 11:25:28 crc kubenswrapper[4752]: I0929 11:25:28.795582 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-60ae-account-create-5djbr"] Sep 29 11:25:29 crc kubenswrapper[4752]: 
I0929 11:25:29.384141 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m7dts"] Sep 29 11:25:29 crc kubenswrapper[4752]: I0929 11:25:29.709759 4752 generic.go:334] "Generic (PLEG): container finished" podID="97c3c1de-9b22-442b-96b0-1df0d483fa12" containerID="3601f11d6fce7e797b457f7c41b8cf7b4a88ba2c4df1e732ad5453804b2f295f" exitCode=0 Sep 29 11:25:29 crc kubenswrapper[4752]: I0929 11:25:29.709871 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-60ae-account-create-5djbr" event={"ID":"97c3c1de-9b22-442b-96b0-1df0d483fa12","Type":"ContainerDied","Data":"3601f11d6fce7e797b457f7c41b8cf7b4a88ba2c4df1e732ad5453804b2f295f"} Sep 29 11:25:29 crc kubenswrapper[4752]: I0929 11:25:29.710120 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-60ae-account-create-5djbr" event={"ID":"97c3c1de-9b22-442b-96b0-1df0d483fa12","Type":"ContainerStarted","Data":"444dd6e68ad37a8b3910f49456c98e2a846d93bef7914b6ae8070a70115c46cf"} Sep 29 11:25:29 crc kubenswrapper[4752]: I0929 11:25:29.710371 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m7dts" podUID="1454f20a-1684-4223-b8d5-b171a26471f7" containerName="registry-server" containerID="cri-o://8d03988aed9a075fcab2a1f6e02d54f5e160f734a38f4cdba18d027be88f385c" gracePeriod=2 Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.287739 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m7dts" Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.359158 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1454f20a-1684-4223-b8d5-b171a26471f7-utilities\") pod \"1454f20a-1684-4223-b8d5-b171a26471f7\" (UID: \"1454f20a-1684-4223-b8d5-b171a26471f7\") " Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.359281 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1454f20a-1684-4223-b8d5-b171a26471f7-catalog-content\") pod \"1454f20a-1684-4223-b8d5-b171a26471f7\" (UID: \"1454f20a-1684-4223-b8d5-b171a26471f7\") " Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.359348 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xbgl\" (UniqueName: \"kubernetes.io/projected/1454f20a-1684-4223-b8d5-b171a26471f7-kube-api-access-2xbgl\") pod \"1454f20a-1684-4223-b8d5-b171a26471f7\" (UID: \"1454f20a-1684-4223-b8d5-b171a26471f7\") " Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.360366 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1454f20a-1684-4223-b8d5-b171a26471f7-utilities" (OuterVolumeSpecName: "utilities") pod "1454f20a-1684-4223-b8d5-b171a26471f7" (UID: "1454f20a-1684-4223-b8d5-b171a26471f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.366293 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1454f20a-1684-4223-b8d5-b171a26471f7-kube-api-access-2xbgl" (OuterVolumeSpecName: "kube-api-access-2xbgl") pod "1454f20a-1684-4223-b8d5-b171a26471f7" (UID: "1454f20a-1684-4223-b8d5-b171a26471f7"). InnerVolumeSpecName "kube-api-access-2xbgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.406603 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1454f20a-1684-4223-b8d5-b171a26471f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1454f20a-1684-4223-b8d5-b171a26471f7" (UID: "1454f20a-1684-4223-b8d5-b171a26471f7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.461460 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1454f20a-1684-4223-b8d5-b171a26471f7-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.461497 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1454f20a-1684-4223-b8d5-b171a26471f7-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.461511 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xbgl\" (UniqueName: \"kubernetes.io/projected/1454f20a-1684-4223-b8d5-b171a26471f7-kube-api-access-2xbgl\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.723700 4752 generic.go:334] "Generic (PLEG): container finished" podID="1454f20a-1684-4223-b8d5-b171a26471f7" containerID="8d03988aed9a075fcab2a1f6e02d54f5e160f734a38f4cdba18d027be88f385c" exitCode=0 Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.723758 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m7dts" Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.723855 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7dts" event={"ID":"1454f20a-1684-4223-b8d5-b171a26471f7","Type":"ContainerDied","Data":"8d03988aed9a075fcab2a1f6e02d54f5e160f734a38f4cdba18d027be88f385c"} Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.723891 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7dts" event={"ID":"1454f20a-1684-4223-b8d5-b171a26471f7","Type":"ContainerDied","Data":"ea49e3d73270f3d08fa4ccafb11e46460cb548ba50cc69a35af2de4255d80de4"} Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.723914 4752 scope.go:117] "RemoveContainer" containerID="8d03988aed9a075fcab2a1f6e02d54f5e160f734a38f4cdba18d027be88f385c" Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.755460 4752 scope.go:117] "RemoveContainer" containerID="7f02411cc8e40123ef978f225f832fe0ee4c67580564c7468028acfa6412a0a1" Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.765039 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m7dts"] Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.772247 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m7dts"] Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.799863 4752 scope.go:117] "RemoveContainer" containerID="a73faccd294b880b7fe63ab59e3c771c853d10de90e82a084c7e1b2986691e63" Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.830482 4752 scope.go:117] "RemoveContainer" containerID="8d03988aed9a075fcab2a1f6e02d54f5e160f734a38f4cdba18d027be88f385c" Sep 29 11:25:30 crc kubenswrapper[4752]: E0929 11:25:30.831090 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8d03988aed9a075fcab2a1f6e02d54f5e160f734a38f4cdba18d027be88f385c\": container with ID starting with 8d03988aed9a075fcab2a1f6e02d54f5e160f734a38f4cdba18d027be88f385c not found: ID does not exist" containerID="8d03988aed9a075fcab2a1f6e02d54f5e160f734a38f4cdba18d027be88f385c" Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.831153 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d03988aed9a075fcab2a1f6e02d54f5e160f734a38f4cdba18d027be88f385c"} err="failed to get container status \"8d03988aed9a075fcab2a1f6e02d54f5e160f734a38f4cdba18d027be88f385c\": rpc error: code = NotFound desc = could not find container \"8d03988aed9a075fcab2a1f6e02d54f5e160f734a38f4cdba18d027be88f385c\": container with ID starting with 8d03988aed9a075fcab2a1f6e02d54f5e160f734a38f4cdba18d027be88f385c not found: ID does not exist" Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.831180 4752 scope.go:117] "RemoveContainer" containerID="7f02411cc8e40123ef978f225f832fe0ee4c67580564c7468028acfa6412a0a1" Sep 29 11:25:30 crc kubenswrapper[4752]: E0929 11:25:30.831495 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f02411cc8e40123ef978f225f832fe0ee4c67580564c7468028acfa6412a0a1\": container with ID starting with 7f02411cc8e40123ef978f225f832fe0ee4c67580564c7468028acfa6412a0a1 not found: ID does not exist" containerID="7f02411cc8e40123ef978f225f832fe0ee4c67580564c7468028acfa6412a0a1" Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.831545 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f02411cc8e40123ef978f225f832fe0ee4c67580564c7468028acfa6412a0a1"} err="failed to get container status \"7f02411cc8e40123ef978f225f832fe0ee4c67580564c7468028acfa6412a0a1\": rpc error: code = NotFound desc = could not find container \"7f02411cc8e40123ef978f225f832fe0ee4c67580564c7468028acfa6412a0a1\": container with ID 
starting with 7f02411cc8e40123ef978f225f832fe0ee4c67580564c7468028acfa6412a0a1 not found: ID does not exist" Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.831566 4752 scope.go:117] "RemoveContainer" containerID="a73faccd294b880b7fe63ab59e3c771c853d10de90e82a084c7e1b2986691e63" Sep 29 11:25:30 crc kubenswrapper[4752]: E0929 11:25:30.832056 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a73faccd294b880b7fe63ab59e3c771c853d10de90e82a084c7e1b2986691e63\": container with ID starting with a73faccd294b880b7fe63ab59e3c771c853d10de90e82a084c7e1b2986691e63 not found: ID does not exist" containerID="a73faccd294b880b7fe63ab59e3c771c853d10de90e82a084c7e1b2986691e63" Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.832100 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a73faccd294b880b7fe63ab59e3c771c853d10de90e82a084c7e1b2986691e63"} err="failed to get container status \"a73faccd294b880b7fe63ab59e3c771c853d10de90e82a084c7e1b2986691e63\": rpc error: code = NotFound desc = could not find container \"a73faccd294b880b7fe63ab59e3c771c853d10de90e82a084c7e1b2986691e63\": container with ID starting with a73faccd294b880b7fe63ab59e3c771c853d10de90e82a084c7e1b2986691e63 not found: ID does not exist" Sep 29 11:25:30 crc kubenswrapper[4752]: I0929 11:25:30.996928 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-60ae-account-create-5djbr" Sep 29 11:25:31 crc kubenswrapper[4752]: I0929 11:25:31.068567 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92lfp\" (UniqueName: \"kubernetes.io/projected/97c3c1de-9b22-442b-96b0-1df0d483fa12-kube-api-access-92lfp\") pod \"97c3c1de-9b22-442b-96b0-1df0d483fa12\" (UID: \"97c3c1de-9b22-442b-96b0-1df0d483fa12\") " Sep 29 11:25:31 crc kubenswrapper[4752]: I0929 11:25:31.073123 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97c3c1de-9b22-442b-96b0-1df0d483fa12-kube-api-access-92lfp" (OuterVolumeSpecName: "kube-api-access-92lfp") pod "97c3c1de-9b22-442b-96b0-1df0d483fa12" (UID: "97c3c1de-9b22-442b-96b0-1df0d483fa12"). InnerVolumeSpecName "kube-api-access-92lfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:25:31 crc kubenswrapper[4752]: I0929 11:25:31.169898 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92lfp\" (UniqueName: \"kubernetes.io/projected/97c3c1de-9b22-442b-96b0-1df0d483fa12-kube-api-access-92lfp\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:31 crc kubenswrapper[4752]: I0929 11:25:31.733425 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-60ae-account-create-5djbr" event={"ID":"97c3c1de-9b22-442b-96b0-1df0d483fa12","Type":"ContainerDied","Data":"444dd6e68ad37a8b3910f49456c98e2a846d93bef7914b6ae8070a70115c46cf"} Sep 29 11:25:31 crc kubenswrapper[4752]: I0929 11:25:31.733475 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="444dd6e68ad37a8b3910f49456c98e2a846d93bef7914b6ae8070a70115c46cf" Sep 29 11:25:31 crc kubenswrapper[4752]: I0929 11:25:31.733533 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-60ae-account-create-5djbr" Sep 29 11:25:32 crc kubenswrapper[4752]: I0929 11:25:32.040150 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1454f20a-1684-4223-b8d5-b171a26471f7" path="/var/lib/kubelet/pods/1454f20a-1684-4223-b8d5-b171a26471f7/volumes" Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.254727 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-drxw2"] Sep 29 11:25:33 crc kubenswrapper[4752]: E0929 11:25:33.255726 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1454f20a-1684-4223-b8d5-b171a26471f7" containerName="extract-utilities" Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.255745 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1454f20a-1684-4223-b8d5-b171a26471f7" containerName="extract-utilities" Sep 29 11:25:33 crc kubenswrapper[4752]: E0929 11:25:33.255887 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1454f20a-1684-4223-b8d5-b171a26471f7" containerName="extract-content" Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.255933 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1454f20a-1684-4223-b8d5-b171a26471f7" containerName="extract-content" Sep 29 11:25:33 crc kubenswrapper[4752]: E0929 11:25:33.255953 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1454f20a-1684-4223-b8d5-b171a26471f7" containerName="registry-server" Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.255964 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1454f20a-1684-4223-b8d5-b171a26471f7" containerName="registry-server" Sep 29 11:25:33 crc kubenswrapper[4752]: E0929 11:25:33.255976 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c3c1de-9b22-442b-96b0-1df0d483fa12" containerName="mariadb-account-create" Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.255990 4752 
state_mem.go:107] "Deleted CPUSet assignment" podUID="97c3c1de-9b22-442b-96b0-1df0d483fa12" containerName="mariadb-account-create" Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.256245 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="97c3c1de-9b22-442b-96b0-1df0d483fa12" containerName="mariadb-account-create" Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.256265 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1454f20a-1684-4223-b8d5-b171a26471f7" containerName="registry-server" Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.257065 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-drxw2" Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.259384 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-9d7gs" Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.260754 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.265229 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-drxw2"] Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.303590 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr755\" (UniqueName: \"kubernetes.io/projected/107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e-kube-api-access-hr755\") pod \"watcher-kuttl-db-sync-drxw2\" (UID: \"107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-drxw2" Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.303663 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e-db-sync-config-data\") pod \"watcher-kuttl-db-sync-drxw2\" (UID: \"107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-drxw2" Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.303703 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e-config-data\") pod \"watcher-kuttl-db-sync-drxw2\" (UID: \"107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-drxw2" Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.405727 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e-db-sync-config-data\") pod \"watcher-kuttl-db-sync-drxw2\" (UID: \"107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-drxw2" Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.405859 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e-config-data\") pod \"watcher-kuttl-db-sync-drxw2\" (UID: \"107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-drxw2" Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.405967 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr755\" (UniqueName: \"kubernetes.io/projected/107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e-kube-api-access-hr755\") pod \"watcher-kuttl-db-sync-drxw2\" (UID: \"107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-drxw2" Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.411426 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e-db-sync-config-data\") pod \"watcher-kuttl-db-sync-drxw2\" (UID: \"107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-drxw2" Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.412294 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e-config-data\") pod \"watcher-kuttl-db-sync-drxw2\" (UID: \"107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-drxw2" Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.421640 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr755\" (UniqueName: \"kubernetes.io/projected/107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e-kube-api-access-hr755\") pod \"watcher-kuttl-db-sync-drxw2\" (UID: \"107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-drxw2" Sep 29 11:25:33 crc kubenswrapper[4752]: I0929 11:25:33.575226 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-drxw2" Sep 29 11:25:34 crc kubenswrapper[4752]: I0929 11:25:34.165911 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-drxw2"] Sep 29 11:25:34 crc kubenswrapper[4752]: I0929 11:25:34.770327 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-drxw2" event={"ID":"107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e","Type":"ContainerStarted","Data":"cec29d708524b25c21787a568e273c2da24bcda5189c74bd71dbe5c47cb35afa"} Sep 29 11:25:34 crc kubenswrapper[4752]: I0929 11:25:34.770653 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-drxw2" event={"ID":"107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e","Type":"ContainerStarted","Data":"65761525fc796141833394379760c15aafcf402c9db5d4117393a637e1a020f4"} Sep 29 11:25:34 crc kubenswrapper[4752]: I0929 11:25:34.793025 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-drxw2" podStartSLOduration=1.793002908 podStartE2EDuration="1.793002908s" podCreationTimestamp="2025-09-29 11:25:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:25:34.78804711 +0000 UTC m=+2475.577188787" watchObservedRunningTime="2025-09-29 11:25:34.793002908 +0000 UTC m=+2475.582144575" Sep 29 11:25:36 crc kubenswrapper[4752]: I0929 11:25:36.785710 4752 generic.go:334] "Generic (PLEG): container finished" podID="107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e" containerID="cec29d708524b25c21787a568e273c2da24bcda5189c74bd71dbe5c47cb35afa" exitCode=0 Sep 29 11:25:36 crc kubenswrapper[4752]: I0929 11:25:36.785809 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-drxw2" 
event={"ID":"107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e","Type":"ContainerDied","Data":"cec29d708524b25c21787a568e273c2da24bcda5189c74bd71dbe5c47cb35afa"} Sep 29 11:25:38 crc kubenswrapper[4752]: I0929 11:25:38.130267 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-drxw2" Sep 29 11:25:38 crc kubenswrapper[4752]: I0929 11:25:38.178005 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr755\" (UniqueName: \"kubernetes.io/projected/107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e-kube-api-access-hr755\") pod \"107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e\" (UID: \"107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e\") " Sep 29 11:25:38 crc kubenswrapper[4752]: I0929 11:25:38.178119 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e-config-data\") pod \"107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e\" (UID: \"107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e\") " Sep 29 11:25:38 crc kubenswrapper[4752]: I0929 11:25:38.178166 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e-db-sync-config-data\") pod \"107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e\" (UID: \"107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e\") " Sep 29 11:25:38 crc kubenswrapper[4752]: I0929 11:25:38.185998 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e" (UID: "107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:25:38 crc kubenswrapper[4752]: I0929 11:25:38.186078 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e-kube-api-access-hr755" (OuterVolumeSpecName: "kube-api-access-hr755") pod "107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e" (UID: "107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e"). InnerVolumeSpecName "kube-api-access-hr755". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:25:38 crc kubenswrapper[4752]: I0929 11:25:38.223409 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e-config-data" (OuterVolumeSpecName: "config-data") pod "107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e" (UID: "107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:25:38 crc kubenswrapper[4752]: I0929 11:25:38.280495 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:38 crc kubenswrapper[4752]: I0929 11:25:38.280733 4752 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:38 crc kubenswrapper[4752]: I0929 11:25:38.280906 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr755\" (UniqueName: \"kubernetes.io/projected/107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e-kube-api-access-hr755\") on node \"crc\" DevicePath \"\"" Sep 29 11:25:38 crc kubenswrapper[4752]: I0929 11:25:38.803414 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-drxw2" 
event={"ID":"107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e","Type":"ContainerDied","Data":"65761525fc796141833394379760c15aafcf402c9db5d4117393a637e1a020f4"} Sep 29 11:25:38 crc kubenswrapper[4752]: I0929 11:25:38.803460 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65761525fc796141833394379760c15aafcf402c9db5d4117393a637e1a020f4" Sep 29 11:25:38 crc kubenswrapper[4752]: I0929 11:25:38.803517 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-drxw2" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.073643 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:25:39 crc kubenswrapper[4752]: E0929 11:25:39.073948 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e" containerName="watcher-kuttl-db-sync" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.073961 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e" containerName="watcher-kuttl-db-sync" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.074136 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e" containerName="watcher-kuttl-db-sync" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.074920 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.077282 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-9d7gs" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.078186 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.091064 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.092559 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.096965 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.098953 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.104744 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.106524 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.116011 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.117509 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.122973 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.128714 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.164822 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.177141 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.194693 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.194993 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/514664cb-8dd6-4485-bd72-85346d81346c-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"514664cb-8dd6-4485-bd72-85346d81346c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.195114 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsj5r\" (UniqueName: \"kubernetes.io/projected/f840aee8-6059-4658-89f6-09d799f64614-kube-api-access-qsj5r\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f840aee8-6059-4658-89f6-09d799f64614\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.195411 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzmv7\" (UniqueName: \"kubernetes.io/projected/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-kube-api-access-lzmv7\") pod \"watcher-kuttl-api-1\" (UID: \"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.195543 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f840aee8-6059-4658-89f6-09d799f64614-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f840aee8-6059-4658-89f6-09d799f64614\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.195617 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f840aee8-6059-4658-89f6-09d799f64614-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f840aee8-6059-4658-89f6-09d799f64614\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.195724 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-logs\") pod \"watcher-kuttl-api-1\" (UID: \"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.195837 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f840aee8-6059-4658-89f6-09d799f64614-custom-prometheus-ca\") pod 
\"watcher-kuttl-decision-engine-0\" (UID: \"f840aee8-6059-4658-89f6-09d799f64614\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.195991 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/514664cb-8dd6-4485-bd72-85346d81346c-logs\") pod \"watcher-kuttl-api-0\" (UID: \"514664cb-8dd6-4485-bd72-85346d81346c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.196043 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/514664cb-8dd6-4485-bd72-85346d81346c-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"514664cb-8dd6-4485-bd72-85346d81346c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.196253 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a72ff6e9-9f96-4da4-979c-7ef8afdc59c7-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"a72ff6e9-9f96-4da4-979c-7ef8afdc59c7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.196291 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpvll\" (UniqueName: \"kubernetes.io/projected/a72ff6e9-9f96-4da4-979c-7ef8afdc59c7-kube-api-access-cpvll\") pod \"watcher-kuttl-applier-0\" (UID: \"a72ff6e9-9f96-4da4-979c-7ef8afdc59c7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.196451 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.196502 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wpg9\" (UniqueName: \"kubernetes.io/projected/514664cb-8dd6-4485-bd72-85346d81346c-kube-api-access-9wpg9\") pod \"watcher-kuttl-api-0\" (UID: \"514664cb-8dd6-4485-bd72-85346d81346c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.196527 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72ff6e9-9f96-4da4-979c-7ef8afdc59c7-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"a72ff6e9-9f96-4da4-979c-7ef8afdc59c7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.298528 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/514664cb-8dd6-4485-bd72-85346d81346c-logs\") pod \"watcher-kuttl-api-0\" (UID: \"514664cb-8dd6-4485-bd72-85346d81346c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.298574 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/514664cb-8dd6-4485-bd72-85346d81346c-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"514664cb-8dd6-4485-bd72-85346d81346c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.298650 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a72ff6e9-9f96-4da4-979c-7ef8afdc59c7-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"a72ff6e9-9f96-4da4-979c-7ef8afdc59c7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.298821 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpvll\" (UniqueName: \"kubernetes.io/projected/a72ff6e9-9f96-4da4-979c-7ef8afdc59c7-kube-api-access-cpvll\") pod \"watcher-kuttl-applier-0\" (UID: \"a72ff6e9-9f96-4da4-979c-7ef8afdc59c7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.299163 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/514664cb-8dd6-4485-bd72-85346d81346c-logs\") pod \"watcher-kuttl-api-0\" (UID: \"514664cb-8dd6-4485-bd72-85346d81346c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.299349 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a72ff6e9-9f96-4da4-979c-7ef8afdc59c7-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"a72ff6e9-9f96-4da4-979c-7ef8afdc59c7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.299471 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.299521 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wpg9\" (UniqueName: \"kubernetes.io/projected/514664cb-8dd6-4485-bd72-85346d81346c-kube-api-access-9wpg9\") pod 
\"watcher-kuttl-api-0\" (UID: \"514664cb-8dd6-4485-bd72-85346d81346c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.299539 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72ff6e9-9f96-4da4-979c-7ef8afdc59c7-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"a72ff6e9-9f96-4da4-979c-7ef8afdc59c7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.299638 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.299679 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/514664cb-8dd6-4485-bd72-85346d81346c-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"514664cb-8dd6-4485-bd72-85346d81346c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.299699 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsj5r\" (UniqueName: \"kubernetes.io/projected/f840aee8-6059-4658-89f6-09d799f64614-kube-api-access-qsj5r\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f840aee8-6059-4658-89f6-09d799f64614\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.299721 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzmv7\" (UniqueName: \"kubernetes.io/projected/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-kube-api-access-lzmv7\") pod \"watcher-kuttl-api-1\" (UID: 
\"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.299753 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f840aee8-6059-4658-89f6-09d799f64614-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f840aee8-6059-4658-89f6-09d799f64614\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.299776 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f840aee8-6059-4658-89f6-09d799f64614-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f840aee8-6059-4658-89f6-09d799f64614\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.299820 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-logs\") pod \"watcher-kuttl-api-1\" (UID: \"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.299843 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f840aee8-6059-4658-89f6-09d799f64614-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f840aee8-6059-4658-89f6-09d799f64614\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.300573 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f840aee8-6059-4658-89f6-09d799f64614-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f840aee8-6059-4658-89f6-09d799f64614\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.300863 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-logs\") pod \"watcher-kuttl-api-1\" (UID: \"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.305904 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f840aee8-6059-4658-89f6-09d799f64614-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f840aee8-6059-4658-89f6-09d799f64614\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.305902 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/514664cb-8dd6-4485-bd72-85346d81346c-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"514664cb-8dd6-4485-bd72-85346d81346c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.306692 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72ff6e9-9f96-4da4-979c-7ef8afdc59c7-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"a72ff6e9-9f96-4da4-979c-7ef8afdc59c7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.307073 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f840aee8-6059-4658-89f6-09d799f64614-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f840aee8-6059-4658-89f6-09d799f64614\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:25:39 crc 
kubenswrapper[4752]: I0929 11:25:39.307416 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.308463 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/514664cb-8dd6-4485-bd72-85346d81346c-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"514664cb-8dd6-4485-bd72-85346d81346c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.319176 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.323006 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzmv7\" (UniqueName: \"kubernetes.io/projected/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-kube-api-access-lzmv7\") pod \"watcher-kuttl-api-1\" (UID: \"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.324444 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsj5r\" (UniqueName: \"kubernetes.io/projected/f840aee8-6059-4658-89f6-09d799f64614-kube-api-access-qsj5r\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f840aee8-6059-4658-89f6-09d799f64614\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.324604 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cpvll\" (UniqueName: \"kubernetes.io/projected/a72ff6e9-9f96-4da4-979c-7ef8afdc59c7-kube-api-access-cpvll\") pod \"watcher-kuttl-applier-0\" (UID: \"a72ff6e9-9f96-4da4-979c-7ef8afdc59c7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.329234 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wpg9\" (UniqueName: \"kubernetes.io/projected/514664cb-8dd6-4485-bd72-85346d81346c-kube-api-access-9wpg9\") pod \"watcher-kuttl-api-0\" (UID: \"514664cb-8dd6-4485-bd72-85346d81346c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.388995 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.410288 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.430005 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.451541 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.724282 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Sep 29 11:25:39 crc kubenswrapper[4752]: W0929 11:25:39.729249 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod514664cb_8dd6_4485_bd72_85346d81346c.slice/crio-37793d51589d92bedc26ed575f05659b3b303681d6ce881ab98bf76820a7c3f5 WatchSource:0}: Error finding container 37793d51589d92bedc26ed575f05659b3b303681d6ce881ab98bf76820a7c3f5: Status 404 returned error can't find the container with id 37793d51589d92bedc26ed575f05659b3b303681d6ce881ab98bf76820a7c3f5 Sep 29 11:25:39 crc kubenswrapper[4752]: I0929 11:25:39.827415 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"514664cb-8dd6-4485-bd72-85346d81346c","Type":"ContainerStarted","Data":"37793d51589d92bedc26ed575f05659b3b303681d6ce881ab98bf76820a7c3f5"} Sep 29 11:25:40 crc kubenswrapper[4752]: I0929 11:25:40.002069 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Sep 29 11:25:40 crc kubenswrapper[4752]: I0929 11:25:40.012654 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Sep 29 11:25:40 crc kubenswrapper[4752]: W0929 11:25:40.015126 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf840aee8_6059_4658_89f6_09d799f64614.slice/crio-a373b497c645ed88afd0f91bef6dcacb4cce9578ff28cd821f11a51e53b72922 WatchSource:0}: Error finding container a373b497c645ed88afd0f91bef6dcacb4cce9578ff28cd821f11a51e53b72922: Status 404 returned error can't find the container with id a373b497c645ed88afd0f91bef6dcacb4cce9578ff28cd821f11a51e53b72922 Sep 29 
11:25:40 crc kubenswrapper[4752]: I0929 11:25:40.144701 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:25:40 crc kubenswrapper[4752]: I0929 11:25:40.838375 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f840aee8-6059-4658-89f6-09d799f64614","Type":"ContainerStarted","Data":"808ada2e1206fb1b6bc6be64d4e7369c387d11d87d065bac5c3baa3cd2d89e8f"} Sep 29 11:25:40 crc kubenswrapper[4752]: I0929 11:25:40.838807 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f840aee8-6059-4658-89f6-09d799f64614","Type":"ContainerStarted","Data":"a373b497c645ed88afd0f91bef6dcacb4cce9578ff28cd821f11a51e53b72922"} Sep 29 11:25:40 crc kubenswrapper[4752]: I0929 11:25:40.840403 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f","Type":"ContainerStarted","Data":"c678f0d8a2213be62656dc64a578177c4c5f70f2a4ba2376857466766218c920"} Sep 29 11:25:40 crc kubenswrapper[4752]: I0929 11:25:40.840440 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f","Type":"ContainerStarted","Data":"94070899e8acd4d5e8e47c4c0add85ef44035fb6afb55608e67393ef4d7d1331"} Sep 29 11:25:40 crc kubenswrapper[4752]: I0929 11:25:40.840453 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f","Type":"ContainerStarted","Data":"acd23cdd52edd2f93622223d2187bfed18cb9f4d12b581a73d267dfbcaf64489"} Sep 29 11:25:40 crc kubenswrapper[4752]: I0929 11:25:40.842049 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-1" Sep 29 11:25:40 crc kubenswrapper[4752]: 
I0929 11:25:40.844849 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"514664cb-8dd6-4485-bd72-85346d81346c","Type":"ContainerStarted","Data":"c2aeb3e958ca36cc87702187b7dbbe6665610fbcc4dd0fab6df4723e465fe7cc"}
Sep 29 11:25:40 crc kubenswrapper[4752]: I0929 11:25:40.844923 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"514664cb-8dd6-4485-bd72-85346d81346c","Type":"ContainerStarted","Data":"882177c1010a738182504c8ff4965c48e3c8b9f425c2477b2a4b427b37c4951e"}
Sep 29 11:25:40 crc kubenswrapper[4752]: I0929 11:25:40.845134 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:25:40 crc kubenswrapper[4752]: I0929 11:25:40.847671 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"a72ff6e9-9f96-4da4-979c-7ef8afdc59c7","Type":"ContainerStarted","Data":"86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086"}
Sep 29 11:25:40 crc kubenswrapper[4752]: I0929 11:25:40.847732 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"a72ff6e9-9f96-4da4-979c-7ef8afdc59c7","Type":"ContainerStarted","Data":"6b808184fa8d5c32f0c6b1b6b6066850985fff4296c723b7af28f8c8380523a1"}
Sep 29 11:25:40 crc kubenswrapper[4752]: I0929 11:25:40.860185 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=1.8601621499999998 podStartE2EDuration="1.86016215s" podCreationTimestamp="2025-09-29 11:25:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:25:40.857401088 +0000 UTC m=+2481.646542755" watchObservedRunningTime="2025-09-29 11:25:40.86016215 +0000 UTC m=+2481.649303817"
Sep 29 11:25:40 crc kubenswrapper[4752]: I0929 11:25:40.878528 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=1.878507417 podStartE2EDuration="1.878507417s" podCreationTimestamp="2025-09-29 11:25:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:25:40.87709591 +0000 UTC m=+2481.666237577" watchObservedRunningTime="2025-09-29 11:25:40.878507417 +0000 UTC m=+2481.667649084"
Sep 29 11:25:40 crc kubenswrapper[4752]: I0929 11:25:40.922201 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-1" podStartSLOduration=1.922185754 podStartE2EDuration="1.922185754s" podCreationTimestamp="2025-09-29 11:25:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:25:40.90514471 +0000 UTC m=+2481.694286397" watchObservedRunningTime="2025-09-29 11:25:40.922185754 +0000 UTC m=+2481.711327421"
Sep 29 11:25:40 crc kubenswrapper[4752]: I0929 11:25:40.924305 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=1.924295978 podStartE2EDuration="1.924295978s" podCreationTimestamp="2025-09-29 11:25:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:25:40.919959335 +0000 UTC m=+2481.709101002" watchObservedRunningTime="2025-09-29 11:25:40.924295978 +0000 UTC m=+2481.713437645"
Sep 29 11:25:42 crc kubenswrapper[4752]: I0929 11:25:42.861373 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 29 11:25:43 crc kubenswrapper[4752]: I0929 11:25:43.277235 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:25:43 crc kubenswrapper[4752]: I0929 11:25:43.480169 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-1"
Sep 29 11:25:43 crc kubenswrapper[4752]: I0929 11:25:43.877116 4752 generic.go:334] "Generic (PLEG): container finished" podID="f840aee8-6059-4658-89f6-09d799f64614" containerID="808ada2e1206fb1b6bc6be64d4e7369c387d11d87d065bac5c3baa3cd2d89e8f" exitCode=1
Sep 29 11:25:43 crc kubenswrapper[4752]: I0929 11:25:43.877185 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f840aee8-6059-4658-89f6-09d799f64614","Type":"ContainerDied","Data":"808ada2e1206fb1b6bc6be64d4e7369c387d11d87d065bac5c3baa3cd2d89e8f"}
Sep 29 11:25:43 crc kubenswrapper[4752]: I0929 11:25:43.893514 4752 scope.go:117] "RemoveContainer" containerID="808ada2e1206fb1b6bc6be64d4e7369c387d11d87d065bac5c3baa3cd2d89e8f"
Sep 29 11:25:44 crc kubenswrapper[4752]: I0929 11:25:44.390227 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:25:44 crc kubenswrapper[4752]: I0929 11:25:44.430244 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-1"
Sep 29 11:25:44 crc kubenswrapper[4752]: I0929 11:25:44.452211 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:25:44 crc kubenswrapper[4752]: I0929 11:25:44.885500 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f840aee8-6059-4658-89f6-09d799f64614","Type":"ContainerStarted","Data":"4b27ca60e969a2ffaccb1411788f4be529ac305413f4a8ffc8e8f7c52ec80cfe"}
Sep 29 11:25:47 crc kubenswrapper[4752]: I0929 11:25:47.913926 4752 generic.go:334] "Generic (PLEG): container finished" podID="f840aee8-6059-4658-89f6-09d799f64614" containerID="4b27ca60e969a2ffaccb1411788f4be529ac305413f4a8ffc8e8f7c52ec80cfe" exitCode=1
Sep 29 11:25:47 crc kubenswrapper[4752]: I0929 11:25:47.914014 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f840aee8-6059-4658-89f6-09d799f64614","Type":"ContainerDied","Data":"4b27ca60e969a2ffaccb1411788f4be529ac305413f4a8ffc8e8f7c52ec80cfe"}
Sep 29 11:25:47 crc kubenswrapper[4752]: I0929 11:25:47.915271 4752 scope.go:117] "RemoveContainer" containerID="808ada2e1206fb1b6bc6be64d4e7369c387d11d87d065bac5c3baa3cd2d89e8f"
Sep 29 11:25:47 crc kubenswrapper[4752]: I0929 11:25:47.915824 4752 scope.go:117] "RemoveContainer" containerID="4b27ca60e969a2ffaccb1411788f4be529ac305413f4a8ffc8e8f7c52ec80cfe"
Sep 29 11:25:47 crc kubenswrapper[4752]: E0929 11:25:47.916144 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614"
Sep 29 11:25:49 crc kubenswrapper[4752]: I0929 11:25:49.389718 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:25:49 crc kubenswrapper[4752]: I0929 11:25:49.395384 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:25:49 crc kubenswrapper[4752]: I0929 11:25:49.410516 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:25:49 crc kubenswrapper[4752]: I0929 11:25:49.410604 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:25:49 crc kubenswrapper[4752]: I0929 11:25:49.411639 4752 scope.go:117] "RemoveContainer" containerID="4b27ca60e969a2ffaccb1411788f4be529ac305413f4a8ffc8e8f7c52ec80cfe"
Sep 29 11:25:49 crc kubenswrapper[4752]: E0929 11:25:49.412118 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614"
Sep 29 11:25:49 crc kubenswrapper[4752]: I0929 11:25:49.430541 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-1"
Sep 29 11:25:49 crc kubenswrapper[4752]: I0929 11:25:49.442126 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-1"
Sep 29 11:25:49 crc kubenswrapper[4752]: I0929 11:25:49.452564 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:25:49 crc kubenswrapper[4752]: I0929 11:25:49.480861 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:25:49 crc kubenswrapper[4752]: I0929 11:25:49.945203 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-1"
Sep 29 11:25:49 crc kubenswrapper[4752]: I0929 11:25:49.948304 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:25:49 crc kubenswrapper[4752]: I0929 11:25:49.961645 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:25:50 crc kubenswrapper[4752]: I0929 11:25:50.359499 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:52 crc kubenswrapper[4752]: I0929 11:25:52.242631 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Sep 29 11:25:52 crc kubenswrapper[4752]: I0929 11:25:52.242896 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="c3da3393-1343-4c89-93ae-0d0c4a42da37" containerName="ceilometer-central-agent" containerID="cri-o://d7fd9138103487aa206ea241d49b3f46e8c2c4039494512f333a436957e63a97" gracePeriod=30
Sep 29 11:25:52 crc kubenswrapper[4752]: I0929 11:25:52.243000 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="c3da3393-1343-4c89-93ae-0d0c4a42da37" containerName="ceilometer-notification-agent" containerID="cri-o://b0c61026715200c4271cbed347d293dd1bc92eb08e9c1fe11c13d7b9693ac03c" gracePeriod=30
Sep 29 11:25:52 crc kubenswrapper[4752]: I0929 11:25:52.243000 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="c3da3393-1343-4c89-93ae-0d0c4a42da37" containerName="sg-core" containerID="cri-o://f4832676b8519faaa5f3a7e45280324c615b2a20a859fe3589cf4e298f3b648c" gracePeriod=30
Sep 29 11:25:52 crc kubenswrapper[4752]: I0929 11:25:52.243184 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="c3da3393-1343-4c89-93ae-0d0c4a42da37" containerName="proxy-httpd" containerID="cri-o://679190bed769d21c559d53903066087cd67ad43acf856a932c66073c0f268273" gracePeriod=30
Sep 29 11:25:52 crc kubenswrapper[4752]: I0929 11:25:52.964292 4752 generic.go:334] "Generic (PLEG): container finished" podID="c3da3393-1343-4c89-93ae-0d0c4a42da37" containerID="679190bed769d21c559d53903066087cd67ad43acf856a932c66073c0f268273" exitCode=0
Sep 29 11:25:52 crc kubenswrapper[4752]: I0929 11:25:52.964572 4752 generic.go:334] "Generic (PLEG): container finished" podID="c3da3393-1343-4c89-93ae-0d0c4a42da37" containerID="f4832676b8519faaa5f3a7e45280324c615b2a20a859fe3589cf4e298f3b648c" exitCode=2
Sep 29 11:25:52 crc kubenswrapper[4752]: I0929 11:25:52.964583 4752 generic.go:334] "Generic (PLEG): container finished" podID="c3da3393-1343-4c89-93ae-0d0c4a42da37" containerID="d7fd9138103487aa206ea241d49b3f46e8c2c4039494512f333a436957e63a97" exitCode=0
Sep 29 11:25:52 crc kubenswrapper[4752]: I0929 11:25:52.964379 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c3da3393-1343-4c89-93ae-0d0c4a42da37","Type":"ContainerDied","Data":"679190bed769d21c559d53903066087cd67ad43acf856a932c66073c0f268273"}
Sep 29 11:25:52 crc kubenswrapper[4752]: I0929 11:25:52.964612 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c3da3393-1343-4c89-93ae-0d0c4a42da37","Type":"ContainerDied","Data":"f4832676b8519faaa5f3a7e45280324c615b2a20a859fe3589cf4e298f3b648c"}
Sep 29 11:25:52 crc kubenswrapper[4752]: I0929 11:25:52.964625 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c3da3393-1343-4c89-93ae-0d0c4a42da37","Type":"ContainerDied","Data":"d7fd9138103487aa206ea241d49b3f46e8c2c4039494512f333a436957e63a97"}
Sep 29 11:25:53 crc kubenswrapper[4752]: I0929 11:25:53.986033 4752 generic.go:334] "Generic (PLEG): container finished" podID="c3da3393-1343-4c89-93ae-0d0c4a42da37" containerID="b0c61026715200c4271cbed347d293dd1bc92eb08e9c1fe11c13d7b9693ac03c" exitCode=0
Sep 29 11:25:53 crc kubenswrapper[4752]: I0929 11:25:53.986081 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c3da3393-1343-4c89-93ae-0d0c4a42da37","Type":"ContainerDied","Data":"b0c61026715200c4271cbed347d293dd1bc92eb08e9c1fe11c13d7b9693ac03c"}
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.156348 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.175130 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3da3393-1343-4c89-93ae-0d0c4a42da37-log-httpd\") pod \"c3da3393-1343-4c89-93ae-0d0c4a42da37\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") "
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.175219 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5khdl\" (UniqueName: \"kubernetes.io/projected/c3da3393-1343-4c89-93ae-0d0c4a42da37-kube-api-access-5khdl\") pod \"c3da3393-1343-4c89-93ae-0d0c4a42da37\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") "
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.175243 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-sg-core-conf-yaml\") pod \"c3da3393-1343-4c89-93ae-0d0c4a42da37\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") "
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.175263 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-config-data\") pod \"c3da3393-1343-4c89-93ae-0d0c4a42da37\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") "
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.175309 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-scripts\") pod \"c3da3393-1343-4c89-93ae-0d0c4a42da37\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") "
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.175357 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3da3393-1343-4c89-93ae-0d0c4a42da37-run-httpd\") pod \"c3da3393-1343-4c89-93ae-0d0c4a42da37\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") "
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.175413 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-ceilometer-tls-certs\") pod \"c3da3393-1343-4c89-93ae-0d0c4a42da37\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") "
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.175481 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-combined-ca-bundle\") pod \"c3da3393-1343-4c89-93ae-0d0c4a42da37\" (UID: \"c3da3393-1343-4c89-93ae-0d0c4a42da37\") "
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.175868 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3da3393-1343-4c89-93ae-0d0c4a42da37-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c3da3393-1343-4c89-93ae-0d0c4a42da37" (UID: "c3da3393-1343-4c89-93ae-0d0c4a42da37"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.175890 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3da3393-1343-4c89-93ae-0d0c4a42da37-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c3da3393-1343-4c89-93ae-0d0c4a42da37" (UID: "c3da3393-1343-4c89-93ae-0d0c4a42da37"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.195516 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-scripts" (OuterVolumeSpecName: "scripts") pod "c3da3393-1343-4c89-93ae-0d0c4a42da37" (UID: "c3da3393-1343-4c89-93ae-0d0c4a42da37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.195552 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3da3393-1343-4c89-93ae-0d0c4a42da37-kube-api-access-5khdl" (OuterVolumeSpecName: "kube-api-access-5khdl") pod "c3da3393-1343-4c89-93ae-0d0c4a42da37" (UID: "c3da3393-1343-4c89-93ae-0d0c4a42da37"). InnerVolumeSpecName "kube-api-access-5khdl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.219246 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c3da3393-1343-4c89-93ae-0d0c4a42da37" (UID: "c3da3393-1343-4c89-93ae-0d0c4a42da37"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.241050 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c3da3393-1343-4c89-93ae-0d0c4a42da37" (UID: "c3da3393-1343-4c89-93ae-0d0c4a42da37"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.273558 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3da3393-1343-4c89-93ae-0d0c4a42da37" (UID: "c3da3393-1343-4c89-93ae-0d0c4a42da37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.276361 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-config-data" (OuterVolumeSpecName: "config-data") pod "c3da3393-1343-4c89-93ae-0d0c4a42da37" (UID: "c3da3393-1343-4c89-93ae-0d0c4a42da37"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.277182 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5khdl\" (UniqueName: \"kubernetes.io/projected/c3da3393-1343-4c89-93ae-0d0c4a42da37-kube-api-access-5khdl\") on node \"crc\" DevicePath \"\""
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.277204 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.277215 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.277224 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-scripts\") on node \"crc\" DevicePath \"\""
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.277233 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3da3393-1343-4c89-93ae-0d0c4a42da37-run-httpd\") on node \"crc\" DevicePath \"\""
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.277242 4752 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.277250 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3da3393-1343-4c89-93ae-0d0c4a42da37-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.277258 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3da3393-1343-4c89-93ae-0d0c4a42da37-log-httpd\") on node \"crc\" DevicePath \"\""
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.996517 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c3da3393-1343-4c89-93ae-0d0c4a42da37","Type":"ContainerDied","Data":"b95acc5a33785c9978d14aff85999c6688ec62a07e3e9206aaf0ab1393c784c4"}
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.996580 4752 scope.go:117] "RemoveContainer" containerID="679190bed769d21c559d53903066087cd67ad43acf856a932c66073c0f268273"
Sep 29 11:25:54 crc kubenswrapper[4752]: I0929 11:25:54.996588 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.018503 4752 scope.go:117] "RemoveContainer" containerID="f4832676b8519faaa5f3a7e45280324c615b2a20a859fe3589cf4e298f3b648c"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.052415 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.059980 4752 scope.go:117] "RemoveContainer" containerID="b0c61026715200c4271cbed347d293dd1bc92eb08e9c1fe11c13d7b9693ac03c"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.061415 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.071559 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Sep 29 11:25:55 crc kubenswrapper[4752]: E0929 11:25:55.071903 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3da3393-1343-4c89-93ae-0d0c4a42da37" containerName="ceilometer-central-agent"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.071921 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3da3393-1343-4c89-93ae-0d0c4a42da37" containerName="ceilometer-central-agent"
Sep 29 11:25:55 crc kubenswrapper[4752]: E0929 11:25:55.071949 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3da3393-1343-4c89-93ae-0d0c4a42da37" containerName="sg-core"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.071955 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3da3393-1343-4c89-93ae-0d0c4a42da37" containerName="sg-core"
Sep 29 11:25:55 crc kubenswrapper[4752]: E0929 11:25:55.071974 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3da3393-1343-4c89-93ae-0d0c4a42da37" containerName="ceilometer-notification-agent"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.071982 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3da3393-1343-4c89-93ae-0d0c4a42da37" containerName="ceilometer-notification-agent"
Sep 29 11:25:55 crc kubenswrapper[4752]: E0929 11:25:55.071993 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3da3393-1343-4c89-93ae-0d0c4a42da37" containerName="proxy-httpd"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.071999 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3da3393-1343-4c89-93ae-0d0c4a42da37" containerName="proxy-httpd"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.072138 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3da3393-1343-4c89-93ae-0d0c4a42da37" containerName="ceilometer-central-agent"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.072151 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3da3393-1343-4c89-93ae-0d0c4a42da37" containerName="ceilometer-notification-agent"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.072167 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3da3393-1343-4c89-93ae-0d0c4a42da37" containerName="proxy-httpd"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.072177 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3da3393-1343-4c89-93ae-0d0c4a42da37" containerName="sg-core"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.073845 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.077112 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.077353 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.077424 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.084334 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.085668 4752 scope.go:117] "RemoveContainer" containerID="d7fd9138103487aa206ea241d49b3f46e8c2c4039494512f333a436957e63a97"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.089142 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.089292 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-config-data\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.089334 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70c139f9-abc5-4375-a70f-768959d3945e-run-httpd\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.089362 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-scripts\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.089431 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.089501 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.089607 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t42z2\" (UniqueName: \"kubernetes.io/projected/70c139f9-abc5-4375-a70f-768959d3945e-kube-api-access-t42z2\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.089663 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70c139f9-abc5-4375-a70f-768959d3945e-log-httpd\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.197637 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.197754 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.197897 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t42z2\" (UniqueName: \"kubernetes.io/projected/70c139f9-abc5-4375-a70f-768959d3945e-kube-api-access-t42z2\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.197976 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70c139f9-abc5-4375-a70f-768959d3945e-log-httpd\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.198056 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.198174 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-config-data\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.198224 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70c139f9-abc5-4375-a70f-768959d3945e-run-httpd\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.198262 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-scripts\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.203499 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-scripts\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.206308 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70c139f9-abc5-4375-a70f-768959d3945e-log-httpd\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.206559 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70c139f9-abc5-4375-a70f-768959d3945e-run-httpd\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.207194 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.211696 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-config-data\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.212787 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.214171 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.221353 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t42z2\" (UniqueName: \"kubernetes.io/projected/70c139f9-abc5-4375-a70f-768959d3945e-kube-api-access-t42z2\") pod \"ceilometer-0\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.393465 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Sep 29 11:25:55 crc kubenswrapper[4752]: I0929 11:25:55.826864 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Sep 29 11:25:56 crc kubenswrapper[4752]: I0929 11:25:56.005106 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"70c139f9-abc5-4375-a70f-768959d3945e","Type":"ContainerStarted","Data":"0f907ad45ff7ea20881207bde8bdf7a243ec4928f39c8f04af02da0d01c5986b"}
Sep 29 11:25:56 crc kubenswrapper[4752]: I0929 11:25:56.043334 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3da3393-1343-4c89-93ae-0d0c4a42da37" path="/var/lib/kubelet/pods/c3da3393-1343-4c89-93ae-0d0c4a42da37/volumes"
Sep 29 11:25:57 crc kubenswrapper[4752]: I0929 11:25:57.018825 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"70c139f9-abc5-4375-a70f-768959d3945e","Type":"ContainerStarted","Data":"92ba532bef5e98659babc1ec73055f9b66e157b9db2ceb6db7057163e037d15a"}
Sep 29 11:25:57 crc kubenswrapper[4752]: I0929 11:25:57.069309 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 29 11:25:58 crc kubenswrapper[4752]: I0929 11:25:58.029274 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"70c139f9-abc5-4375-a70f-768959d3945e","Type":"ContainerStarted","Data":"7e94e0121f7521fb5e12d0c79f85472c2a3a837e529ded4449b1778d4863e3d5"}
Sep 29 11:25:59 crc kubenswrapper[4752]: I0929 11:25:59.041659 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"70c139f9-abc5-4375-a70f-768959d3945e","Type":"ContainerStarted","Data":"2de9987615f58684a718a1384a0714af8a8938e628d0a1a3962a6135810349b1"}
Sep 29 11:26:00 crc kubenswrapper[4752]: I0929 11:26:00.054297 4752 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"70c139f9-abc5-4375-a70f-768959d3945e","Type":"ContainerStarted","Data":"f29594268803a043a1620f6e0479c1da3affafabd1524e87c5cd3d694969ded8"} Sep 29 11:26:00 crc kubenswrapper[4752]: I0929 11:26:00.054782 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:26:00 crc kubenswrapper[4752]: I0929 11:26:00.106120 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.686643236 podStartE2EDuration="5.106103706s" podCreationTimestamp="2025-09-29 11:25:55 +0000 UTC" firstStartedPulling="2025-09-29 11:25:55.83579528 +0000 UTC m=+2496.624936947" lastFinishedPulling="2025-09-29 11:25:59.25525574 +0000 UTC m=+2500.044397417" observedRunningTime="2025-09-29 11:26:00.097950174 +0000 UTC m=+2500.887091861" watchObservedRunningTime="2025-09-29 11:26:00.106103706 +0000 UTC m=+2500.895245373" Sep 29 11:26:02 crc kubenswrapper[4752]: I0929 11:26:02.034816 4752 scope.go:117] "RemoveContainer" containerID="4b27ca60e969a2ffaccb1411788f4be529ac305413f4a8ffc8e8f7c52ec80cfe" Sep 29 11:26:03 crc kubenswrapper[4752]: I0929 11:26:03.094140 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f840aee8-6059-4658-89f6-09d799f64614","Type":"ContainerStarted","Data":"9144f9b7fd810a7f9909904f5a4fa35660a2c577c4fa058a01ba59740650d41c"} Sep 29 11:26:05 crc kubenswrapper[4752]: I0929 11:26:05.112286 4752 generic.go:334] "Generic (PLEG): container finished" podID="f840aee8-6059-4658-89f6-09d799f64614" containerID="9144f9b7fd810a7f9909904f5a4fa35660a2c577c4fa058a01ba59740650d41c" exitCode=1 Sep 29 11:26:05 crc kubenswrapper[4752]: I0929 11:26:05.112461 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" 
event={"ID":"f840aee8-6059-4658-89f6-09d799f64614","Type":"ContainerDied","Data":"9144f9b7fd810a7f9909904f5a4fa35660a2c577c4fa058a01ba59740650d41c"} Sep 29 11:26:05 crc kubenswrapper[4752]: I0929 11:26:05.112736 4752 scope.go:117] "RemoveContainer" containerID="4b27ca60e969a2ffaccb1411788f4be529ac305413f4a8ffc8e8f7c52ec80cfe" Sep 29 11:26:05 crc kubenswrapper[4752]: I0929 11:26:05.115199 4752 scope.go:117] "RemoveContainer" containerID="9144f9b7fd810a7f9909904f5a4fa35660a2c577c4fa058a01ba59740650d41c" Sep 29 11:26:05 crc kubenswrapper[4752]: E0929 11:26:05.115568 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:26:09 crc kubenswrapper[4752]: I0929 11:26:09.411389 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:26:09 crc kubenswrapper[4752]: I0929 11:26:09.411992 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:26:09 crc kubenswrapper[4752]: I0929 11:26:09.412007 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:26:09 crc kubenswrapper[4752]: I0929 11:26:09.412019 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:26:09 crc kubenswrapper[4752]: I0929 11:26:09.412677 4752 scope.go:117] "RemoveContainer" containerID="9144f9b7fd810a7f9909904f5a4fa35660a2c577c4fa058a01ba59740650d41c" Sep 29 11:26:09 crc kubenswrapper[4752]: 
E0929 11:26:09.412963 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:26:22 crc kubenswrapper[4752]: I0929 11:26:22.153909 4752 scope.go:117] "RemoveContainer" containerID="0e4f1e827e2e4ebc0ea1d82fac6dbc8c45acccf2dca2fa2aecbdde2864203599" Sep 29 11:26:22 crc kubenswrapper[4752]: I0929 11:26:22.185940 4752 scope.go:117] "RemoveContainer" containerID="677ab6c2f5bed7596e70d334d998a9e911915f5add047779a0020fe282fe2d04" Sep 29 11:26:22 crc kubenswrapper[4752]: I0929 11:26:22.223293 4752 scope.go:117] "RemoveContainer" containerID="5f30c3a09e2a5ce1467c592640ce15f1520dbbe9b97508532e8211b2514f57aa" Sep 29 11:26:22 crc kubenswrapper[4752]: I0929 11:26:22.246282 4752 scope.go:117] "RemoveContainer" containerID="eeef5656eeb1c35fe864f226bec885b77000cf1e0b00ea410bd57ddb27f9d93b" Sep 29 11:26:22 crc kubenswrapper[4752]: I0929 11:26:22.286532 4752 scope.go:117] "RemoveContainer" containerID="f15a8bb442a51a678ebde0546d73481be85419f62f5a5c7ce896f4bc32bed92c" Sep 29 11:26:24 crc kubenswrapper[4752]: I0929 11:26:24.031724 4752 scope.go:117] "RemoveContainer" containerID="9144f9b7fd810a7f9909904f5a4fa35660a2c577c4fa058a01ba59740650d41c" Sep 29 11:26:24 crc kubenswrapper[4752]: E0929 11:26:24.031915 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 
11:26:25 crc kubenswrapper[4752]: I0929 11:26:25.408638 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:26:39 crc kubenswrapper[4752]: I0929 11:26:39.031299 4752 scope.go:117] "RemoveContainer" containerID="9144f9b7fd810a7f9909904f5a4fa35660a2c577c4fa058a01ba59740650d41c" Sep 29 11:26:39 crc kubenswrapper[4752]: I0929 11:26:39.443029 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f840aee8-6059-4658-89f6-09d799f64614","Type":"ContainerStarted","Data":"94bc8fc27eef660a39d2d37350f4fd1744564d2208a31442111af6ef8bfa4500"} Sep 29 11:26:42 crc kubenswrapper[4752]: I0929 11:26:42.465184 4752 generic.go:334] "Generic (PLEG): container finished" podID="f840aee8-6059-4658-89f6-09d799f64614" containerID="94bc8fc27eef660a39d2d37350f4fd1744564d2208a31442111af6ef8bfa4500" exitCode=1 Sep 29 11:26:42 crc kubenswrapper[4752]: I0929 11:26:42.465686 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f840aee8-6059-4658-89f6-09d799f64614","Type":"ContainerDied","Data":"94bc8fc27eef660a39d2d37350f4fd1744564d2208a31442111af6ef8bfa4500"} Sep 29 11:26:42 crc kubenswrapper[4752]: I0929 11:26:42.465726 4752 scope.go:117] "RemoveContainer" containerID="9144f9b7fd810a7f9909904f5a4fa35660a2c577c4fa058a01ba59740650d41c" Sep 29 11:26:42 crc kubenswrapper[4752]: I0929 11:26:42.466345 4752 scope.go:117] "RemoveContainer" containerID="94bc8fc27eef660a39d2d37350f4fd1744564d2208a31442111af6ef8bfa4500" Sep 29 11:26:42 crc kubenswrapper[4752]: E0929 11:26:42.466620 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:26:49 crc kubenswrapper[4752]: I0929 11:26:49.410517 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:26:49 crc kubenswrapper[4752]: I0929 11:26:49.411137 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:26:49 crc kubenswrapper[4752]: I0929 11:26:49.411830 4752 scope.go:117] "RemoveContainer" containerID="94bc8fc27eef660a39d2d37350f4fd1744564d2208a31442111af6ef8bfa4500" Sep 29 11:26:49 crc kubenswrapper[4752]: E0929 11:26:49.412055 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:27:04 crc kubenswrapper[4752]: I0929 11:27:04.030698 4752 scope.go:117] "RemoveContainer" containerID="94bc8fc27eef660a39d2d37350f4fd1744564d2208a31442111af6ef8bfa4500" Sep 29 11:27:04 crc kubenswrapper[4752]: E0929 11:27:04.031522 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:27:09 crc kubenswrapper[4752]: I0929 11:27:09.411395 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:27:09 crc kubenswrapper[4752]: I0929 11:27:09.412684 4752 scope.go:117] "RemoveContainer" containerID="94bc8fc27eef660a39d2d37350f4fd1744564d2208a31442111af6ef8bfa4500" Sep 29 11:27:09 crc kubenswrapper[4752]: E0929 11:27:09.412977 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:27:09 crc kubenswrapper[4752]: I0929 11:27:09.413377 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:27:09 crc kubenswrapper[4752]: I0929 11:27:09.692060 4752 scope.go:117] "RemoveContainer" containerID="94bc8fc27eef660a39d2d37350f4fd1744564d2208a31442111af6ef8bfa4500" Sep 29 11:27:09 crc kubenswrapper[4752]: E0929 11:27:09.692271 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:27:22 crc kubenswrapper[4752]: I0929 11:27:22.031418 4752 scope.go:117] "RemoveContainer" containerID="94bc8fc27eef660a39d2d37350f4fd1744564d2208a31442111af6ef8bfa4500" Sep 29 11:27:22 crc kubenswrapper[4752]: I0929 11:27:22.511981 4752 scope.go:117] "RemoveContainer" containerID="da63c6596140475a28463c9fad39f6b1c0241c51dbb01c52f19a84c3a9a26c6e" Sep 29 11:27:22 crc kubenswrapper[4752]: I0929 11:27:22.557264 4752 
scope.go:117] "RemoveContainer" containerID="4fbf3f9f90bbde8caa7b4d7d58e2151c32ac096a9e406c00832a3a1df376cb28" Sep 29 11:27:22 crc kubenswrapper[4752]: I0929 11:27:22.815333 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f840aee8-6059-4658-89f6-09d799f64614","Type":"ContainerStarted","Data":"71e48599f57587f4940dd2e25365599f16c0cace56058f7454f07fb9220c8aa0"} Sep 29 11:27:25 crc kubenswrapper[4752]: I0929 11:27:25.841413 4752 generic.go:334] "Generic (PLEG): container finished" podID="f840aee8-6059-4658-89f6-09d799f64614" containerID="71e48599f57587f4940dd2e25365599f16c0cace56058f7454f07fb9220c8aa0" exitCode=1 Sep 29 11:27:25 crc kubenswrapper[4752]: I0929 11:27:25.841491 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f840aee8-6059-4658-89f6-09d799f64614","Type":"ContainerDied","Data":"71e48599f57587f4940dd2e25365599f16c0cace56058f7454f07fb9220c8aa0"} Sep 29 11:27:25 crc kubenswrapper[4752]: I0929 11:27:25.841746 4752 scope.go:117] "RemoveContainer" containerID="94bc8fc27eef660a39d2d37350f4fd1744564d2208a31442111af6ef8bfa4500" Sep 29 11:27:25 crc kubenswrapper[4752]: I0929 11:27:25.842267 4752 scope.go:117] "RemoveContainer" containerID="71e48599f57587f4940dd2e25365599f16c0cace56058f7454f07fb9220c8aa0" Sep 29 11:27:25 crc kubenswrapper[4752]: E0929 11:27:25.842581 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:27:29 crc kubenswrapper[4752]: I0929 11:27:29.411341 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:27:29 crc kubenswrapper[4752]: I0929 11:27:29.411775 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:27:29 crc kubenswrapper[4752]: I0929 11:27:29.412336 4752 scope.go:117] "RemoveContainer" containerID="71e48599f57587f4940dd2e25365599f16c0cace56058f7454f07fb9220c8aa0" Sep 29 11:27:29 crc kubenswrapper[4752]: E0929 11:27:29.412602 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:27:39 crc kubenswrapper[4752]: I0929 11:27:39.411315 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:27:39 crc kubenswrapper[4752]: I0929 11:27:39.412112 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:27:39 crc kubenswrapper[4752]: I0929 11:27:39.413153 4752 scope.go:117] "RemoveContainer" containerID="71e48599f57587f4940dd2e25365599f16c0cace56058f7454f07fb9220c8aa0" Sep 29 11:27:39 crc kubenswrapper[4752]: E0929 11:27:39.413697 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 
11:27:52 crc kubenswrapper[4752]: I0929 11:27:52.031450 4752 scope.go:117] "RemoveContainer" containerID="71e48599f57587f4940dd2e25365599f16c0cace56058f7454f07fb9220c8aa0" Sep 29 11:27:52 crc kubenswrapper[4752]: E0929 11:27:52.032169 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:27:56 crc kubenswrapper[4752]: I0929 11:27:56.176112 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 11:27:56 crc kubenswrapper[4752]: I0929 11:27:56.176702 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:28:04 crc kubenswrapper[4752]: I0929 11:28:04.031487 4752 scope.go:117] "RemoveContainer" containerID="71e48599f57587f4940dd2e25365599f16c0cace56058f7454f07fb9220c8aa0" Sep 29 11:28:04 crc kubenswrapper[4752]: E0929 11:28:04.032636 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:28:15 crc kubenswrapper[4752]: I0929 11:28:15.031723 4752 scope.go:117] "RemoveContainer" containerID="71e48599f57587f4940dd2e25365599f16c0cace56058f7454f07fb9220c8aa0" Sep 29 11:28:15 crc kubenswrapper[4752]: E0929 11:28:15.032433 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:28:26 crc kubenswrapper[4752]: I0929 11:28:26.175967 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 11:28:26 crc kubenswrapper[4752]: I0929 11:28:26.176561 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:28:28 crc kubenswrapper[4752]: I0929 11:28:28.030954 4752 scope.go:117] "RemoveContainer" containerID="71e48599f57587f4940dd2e25365599f16c0cace56058f7454f07fb9220c8aa0" Sep 29 11:28:28 crc kubenswrapper[4752]: E0929 11:28:28.031511 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine 
pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:28:41 crc kubenswrapper[4752]: I0929 11:28:41.031567 4752 scope.go:117] "RemoveContainer" containerID="71e48599f57587f4940dd2e25365599f16c0cace56058f7454f07fb9220c8aa0" Sep 29 11:28:41 crc kubenswrapper[4752]: E0929 11:28:41.032412 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:28:54 crc kubenswrapper[4752]: I0929 11:28:54.190429 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f957l"] Sep 29 11:28:54 crc kubenswrapper[4752]: I0929 11:28:54.192948 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f957l" Sep 29 11:28:54 crc kubenswrapper[4752]: I0929 11:28:54.205369 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f957l"] Sep 29 11:28:54 crc kubenswrapper[4752]: I0929 11:28:54.266163 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa23946-d44e-43ca-934b-521bd91a01ff-catalog-content\") pod \"redhat-marketplace-f957l\" (UID: \"0fa23946-d44e-43ca-934b-521bd91a01ff\") " pod="openshift-marketplace/redhat-marketplace-f957l" Sep 29 11:28:54 crc kubenswrapper[4752]: I0929 11:28:54.266275 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2clm\" (UniqueName: \"kubernetes.io/projected/0fa23946-d44e-43ca-934b-521bd91a01ff-kube-api-access-l2clm\") pod \"redhat-marketplace-f957l\" (UID: \"0fa23946-d44e-43ca-934b-521bd91a01ff\") " pod="openshift-marketplace/redhat-marketplace-f957l" Sep 29 11:28:54 crc kubenswrapper[4752]: I0929 11:28:54.266388 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa23946-d44e-43ca-934b-521bd91a01ff-utilities\") pod \"redhat-marketplace-f957l\" (UID: \"0fa23946-d44e-43ca-934b-521bd91a01ff\") " pod="openshift-marketplace/redhat-marketplace-f957l" Sep 29 11:28:54 crc kubenswrapper[4752]: I0929 11:28:54.368030 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa23946-d44e-43ca-934b-521bd91a01ff-catalog-content\") pod \"redhat-marketplace-f957l\" (UID: \"0fa23946-d44e-43ca-934b-521bd91a01ff\") " pod="openshift-marketplace/redhat-marketplace-f957l" Sep 29 11:28:54 crc kubenswrapper[4752]: I0929 11:28:54.368150 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-l2clm\" (UniqueName: \"kubernetes.io/projected/0fa23946-d44e-43ca-934b-521bd91a01ff-kube-api-access-l2clm\") pod \"redhat-marketplace-f957l\" (UID: \"0fa23946-d44e-43ca-934b-521bd91a01ff\") " pod="openshift-marketplace/redhat-marketplace-f957l" Sep 29 11:28:54 crc kubenswrapper[4752]: I0929 11:28:54.368226 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa23946-d44e-43ca-934b-521bd91a01ff-utilities\") pod \"redhat-marketplace-f957l\" (UID: \"0fa23946-d44e-43ca-934b-521bd91a01ff\") " pod="openshift-marketplace/redhat-marketplace-f957l" Sep 29 11:28:54 crc kubenswrapper[4752]: I0929 11:28:54.368591 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa23946-d44e-43ca-934b-521bd91a01ff-catalog-content\") pod \"redhat-marketplace-f957l\" (UID: \"0fa23946-d44e-43ca-934b-521bd91a01ff\") " pod="openshift-marketplace/redhat-marketplace-f957l" Sep 29 11:28:54 crc kubenswrapper[4752]: I0929 11:28:54.368683 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa23946-d44e-43ca-934b-521bd91a01ff-utilities\") pod \"redhat-marketplace-f957l\" (UID: \"0fa23946-d44e-43ca-934b-521bd91a01ff\") " pod="openshift-marketplace/redhat-marketplace-f957l" Sep 29 11:28:54 crc kubenswrapper[4752]: I0929 11:28:54.392966 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2clm\" (UniqueName: \"kubernetes.io/projected/0fa23946-d44e-43ca-934b-521bd91a01ff-kube-api-access-l2clm\") pod \"redhat-marketplace-f957l\" (UID: \"0fa23946-d44e-43ca-934b-521bd91a01ff\") " pod="openshift-marketplace/redhat-marketplace-f957l" Sep 29 11:28:54 crc kubenswrapper[4752]: I0929 11:28:54.518161 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f957l" Sep 29 11:28:54 crc kubenswrapper[4752]: I0929 11:28:54.940658 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f957l"] Sep 29 11:28:55 crc kubenswrapper[4752]: I0929 11:28:55.032379 4752 scope.go:117] "RemoveContainer" containerID="71e48599f57587f4940dd2e25365599f16c0cace56058f7454f07fb9220c8aa0" Sep 29 11:28:55 crc kubenswrapper[4752]: I0929 11:28:55.539866 4752 generic.go:334] "Generic (PLEG): container finished" podID="0fa23946-d44e-43ca-934b-521bd91a01ff" containerID="9eedcd188cf7fbfd218599224168819f1b149a8c2f140e3e59978e0818b63f15" exitCode=0 Sep 29 11:28:55 crc kubenswrapper[4752]: I0929 11:28:55.539898 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f957l" event={"ID":"0fa23946-d44e-43ca-934b-521bd91a01ff","Type":"ContainerDied","Data":"9eedcd188cf7fbfd218599224168819f1b149a8c2f140e3e59978e0818b63f15"} Sep 29 11:28:55 crc kubenswrapper[4752]: I0929 11:28:55.540209 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f957l" event={"ID":"0fa23946-d44e-43ca-934b-521bd91a01ff","Type":"ContainerStarted","Data":"f8d319389f86df48840774efc3d7248aa18f4d3eaddfad4baa38fca49ea4bb2f"} Sep 29 11:28:55 crc kubenswrapper[4752]: I0929 11:28:55.542315 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f840aee8-6059-4658-89f6-09d799f64614","Type":"ContainerStarted","Data":"90950e56d39442e337bb3cdfe301d5f724ec54f47abd605b87f56e9741a68f4f"} Sep 29 11:28:56 crc kubenswrapper[4752]: I0929 11:28:56.175703 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Sep 29 11:28:56 crc kubenswrapper[4752]: I0929 11:28:56.176034 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:28:56 crc kubenswrapper[4752]: I0929 11:28:56.176084 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 11:28:56 crc kubenswrapper[4752]: I0929 11:28:56.176855 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"62794379c57b608ef54dfd72ad21536541e2a354845aabac9be8f3dde3444511"} pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 11:28:56 crc kubenswrapper[4752]: I0929 11:28:56.176931 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" containerID="cri-o://62794379c57b608ef54dfd72ad21536541e2a354845aabac9be8f3dde3444511" gracePeriod=600 Sep 29 11:28:56 crc kubenswrapper[4752]: I0929 11:28:56.558083 4752 generic.go:334] "Generic (PLEG): container finished" podID="5863c243-797d-462a-b11f-71aaf005f8d1" containerID="62794379c57b608ef54dfd72ad21536541e2a354845aabac9be8f3dde3444511" exitCode=0 Sep 29 11:28:56 crc kubenswrapper[4752]: I0929 11:28:56.558104 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" 
event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerDied","Data":"62794379c57b608ef54dfd72ad21536541e2a354845aabac9be8f3dde3444511"} Sep 29 11:28:56 crc kubenswrapper[4752]: I0929 11:28:56.558149 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerStarted","Data":"4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6"} Sep 29 11:28:56 crc kubenswrapper[4752]: I0929 11:28:56.558169 4752 scope.go:117] "RemoveContainer" containerID="93752bca1235c82c7e20c88ea68e0afd59b9dc59d3315066b08789cc37a87e37" Sep 29 11:28:57 crc kubenswrapper[4752]: I0929 11:28:57.567591 4752 generic.go:334] "Generic (PLEG): container finished" podID="0fa23946-d44e-43ca-934b-521bd91a01ff" containerID="cfb65b9c69d3252967fe3c4f72279a788328e7130c858541d4553efd17e4d41b" exitCode=0 Sep 29 11:28:57 crc kubenswrapper[4752]: I0929 11:28:57.568196 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f957l" event={"ID":"0fa23946-d44e-43ca-934b-521bd91a01ff","Type":"ContainerDied","Data":"cfb65b9c69d3252967fe3c4f72279a788328e7130c858541d4553efd17e4d41b"} Sep 29 11:28:58 crc kubenswrapper[4752]: I0929 11:28:58.582532 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f957l" event={"ID":"0fa23946-d44e-43ca-934b-521bd91a01ff","Type":"ContainerStarted","Data":"0e1453a7443622f8844c6b8ef01027d4960a26fe418bc295972e0e10e21cb7b2"} Sep 29 11:28:58 crc kubenswrapper[4752]: I0929 11:28:58.584137 4752 generic.go:334] "Generic (PLEG): container finished" podID="f840aee8-6059-4658-89f6-09d799f64614" containerID="90950e56d39442e337bb3cdfe301d5f724ec54f47abd605b87f56e9741a68f4f" exitCode=1 Sep 29 11:28:58 crc kubenswrapper[4752]: I0929 11:28:58.584180 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" 
event={"ID":"f840aee8-6059-4658-89f6-09d799f64614","Type":"ContainerDied","Data":"90950e56d39442e337bb3cdfe301d5f724ec54f47abd605b87f56e9741a68f4f"} Sep 29 11:28:58 crc kubenswrapper[4752]: I0929 11:28:58.584265 4752 scope.go:117] "RemoveContainer" containerID="71e48599f57587f4940dd2e25365599f16c0cace56058f7454f07fb9220c8aa0" Sep 29 11:28:58 crc kubenswrapper[4752]: I0929 11:28:58.584505 4752 scope.go:117] "RemoveContainer" containerID="90950e56d39442e337bb3cdfe301d5f724ec54f47abd605b87f56e9741a68f4f" Sep 29 11:28:58 crc kubenswrapper[4752]: E0929 11:28:58.584678 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:28:58 crc kubenswrapper[4752]: I0929 11:28:58.610639 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f957l" podStartSLOduration=2.147766827 podStartE2EDuration="4.610621009s" podCreationTimestamp="2025-09-29 11:28:54 +0000 UTC" firstStartedPulling="2025-09-29 11:28:55.541387736 +0000 UTC m=+2676.330529403" lastFinishedPulling="2025-09-29 11:28:58.004241918 +0000 UTC m=+2678.793383585" observedRunningTime="2025-09-29 11:28:58.604076468 +0000 UTC m=+2679.393218135" watchObservedRunningTime="2025-09-29 11:28:58.610621009 +0000 UTC m=+2679.399762676" Sep 29 11:28:59 crc kubenswrapper[4752]: I0929 11:28:59.410380 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:28:59 crc kubenswrapper[4752]: I0929 11:28:59.410643 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:28:59 crc kubenswrapper[4752]: I0929 11:28:59.598615 4752 scope.go:117] "RemoveContainer" containerID="90950e56d39442e337bb3cdfe301d5f724ec54f47abd605b87f56e9741a68f4f" Sep 29 11:28:59 crc kubenswrapper[4752]: E0929 11:28:59.598927 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:29:04 crc kubenswrapper[4752]: I0929 11:29:04.519120 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f957l" Sep 29 11:29:04 crc kubenswrapper[4752]: I0929 11:29:04.519750 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f957l" Sep 29 11:29:04 crc kubenswrapper[4752]: I0929 11:29:04.577886 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f957l" Sep 29 11:29:04 crc kubenswrapper[4752]: I0929 11:29:04.678321 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f957l" Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.180697 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f957l"] Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.181470 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f957l" podUID="0fa23946-d44e-43ca-934b-521bd91a01ff" containerName="registry-server" 
containerID="cri-o://0e1453a7443622f8844c6b8ef01027d4960a26fe418bc295972e0e10e21cb7b2" gracePeriod=2 Sep 29 11:29:08 crc kubenswrapper[4752]: E0929 11:29:08.499137 4752 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.21:38098->38.102.83.21:41527: write tcp 38.102.83.21:38098->38.102.83.21:41527: write: broken pipe Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.578535 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f957l" Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.664139 4752 generic.go:334] "Generic (PLEG): container finished" podID="0fa23946-d44e-43ca-934b-521bd91a01ff" containerID="0e1453a7443622f8844c6b8ef01027d4960a26fe418bc295972e0e10e21cb7b2" exitCode=0 Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.664190 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f957l" Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.664189 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f957l" event={"ID":"0fa23946-d44e-43ca-934b-521bd91a01ff","Type":"ContainerDied","Data":"0e1453a7443622f8844c6b8ef01027d4960a26fe418bc295972e0e10e21cb7b2"} Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.664321 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f957l" event={"ID":"0fa23946-d44e-43ca-934b-521bd91a01ff","Type":"ContainerDied","Data":"f8d319389f86df48840774efc3d7248aa18f4d3eaddfad4baa38fca49ea4bb2f"} Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.664346 4752 scope.go:117] "RemoveContainer" containerID="0e1453a7443622f8844c6b8ef01027d4960a26fe418bc295972e0e10e21cb7b2" Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.681710 4752 scope.go:117] "RemoveContainer" 
containerID="cfb65b9c69d3252967fe3c4f72279a788328e7130c858541d4553efd17e4d41b" Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.701008 4752 scope.go:117] "RemoveContainer" containerID="9eedcd188cf7fbfd218599224168819f1b149a8c2f140e3e59978e0818b63f15" Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.709649 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa23946-d44e-43ca-934b-521bd91a01ff-catalog-content\") pod \"0fa23946-d44e-43ca-934b-521bd91a01ff\" (UID: \"0fa23946-d44e-43ca-934b-521bd91a01ff\") " Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.709769 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa23946-d44e-43ca-934b-521bd91a01ff-utilities\") pod \"0fa23946-d44e-43ca-934b-521bd91a01ff\" (UID: \"0fa23946-d44e-43ca-934b-521bd91a01ff\") " Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.709862 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2clm\" (UniqueName: \"kubernetes.io/projected/0fa23946-d44e-43ca-934b-521bd91a01ff-kube-api-access-l2clm\") pod \"0fa23946-d44e-43ca-934b-521bd91a01ff\" (UID: \"0fa23946-d44e-43ca-934b-521bd91a01ff\") " Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.710850 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fa23946-d44e-43ca-934b-521bd91a01ff-utilities" (OuterVolumeSpecName: "utilities") pod "0fa23946-d44e-43ca-934b-521bd91a01ff" (UID: "0fa23946-d44e-43ca-934b-521bd91a01ff"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.715045 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fa23946-d44e-43ca-934b-521bd91a01ff-kube-api-access-l2clm" (OuterVolumeSpecName: "kube-api-access-l2clm") pod "0fa23946-d44e-43ca-934b-521bd91a01ff" (UID: "0fa23946-d44e-43ca-934b-521bd91a01ff"). InnerVolumeSpecName "kube-api-access-l2clm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.723094 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fa23946-d44e-43ca-934b-521bd91a01ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0fa23946-d44e-43ca-934b-521bd91a01ff" (UID: "0fa23946-d44e-43ca-934b-521bd91a01ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.760302 4752 scope.go:117] "RemoveContainer" containerID="0e1453a7443622f8844c6b8ef01027d4960a26fe418bc295972e0e10e21cb7b2" Sep 29 11:29:08 crc kubenswrapper[4752]: E0929 11:29:08.760916 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e1453a7443622f8844c6b8ef01027d4960a26fe418bc295972e0e10e21cb7b2\": container with ID starting with 0e1453a7443622f8844c6b8ef01027d4960a26fe418bc295972e0e10e21cb7b2 not found: ID does not exist" containerID="0e1453a7443622f8844c6b8ef01027d4960a26fe418bc295972e0e10e21cb7b2" Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.760960 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e1453a7443622f8844c6b8ef01027d4960a26fe418bc295972e0e10e21cb7b2"} err="failed to get container status \"0e1453a7443622f8844c6b8ef01027d4960a26fe418bc295972e0e10e21cb7b2\": rpc error: code = NotFound desc = could not find 
container \"0e1453a7443622f8844c6b8ef01027d4960a26fe418bc295972e0e10e21cb7b2\": container with ID starting with 0e1453a7443622f8844c6b8ef01027d4960a26fe418bc295972e0e10e21cb7b2 not found: ID does not exist" Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.761003 4752 scope.go:117] "RemoveContainer" containerID="cfb65b9c69d3252967fe3c4f72279a788328e7130c858541d4553efd17e4d41b" Sep 29 11:29:08 crc kubenswrapper[4752]: E0929 11:29:08.761382 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfb65b9c69d3252967fe3c4f72279a788328e7130c858541d4553efd17e4d41b\": container with ID starting with cfb65b9c69d3252967fe3c4f72279a788328e7130c858541d4553efd17e4d41b not found: ID does not exist" containerID="cfb65b9c69d3252967fe3c4f72279a788328e7130c858541d4553efd17e4d41b" Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.761450 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfb65b9c69d3252967fe3c4f72279a788328e7130c858541d4553efd17e4d41b"} err="failed to get container status \"cfb65b9c69d3252967fe3c4f72279a788328e7130c858541d4553efd17e4d41b\": rpc error: code = NotFound desc = could not find container \"cfb65b9c69d3252967fe3c4f72279a788328e7130c858541d4553efd17e4d41b\": container with ID starting with cfb65b9c69d3252967fe3c4f72279a788328e7130c858541d4553efd17e4d41b not found: ID does not exist" Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.761503 4752 scope.go:117] "RemoveContainer" containerID="9eedcd188cf7fbfd218599224168819f1b149a8c2f140e3e59978e0818b63f15" Sep 29 11:29:08 crc kubenswrapper[4752]: E0929 11:29:08.761893 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eedcd188cf7fbfd218599224168819f1b149a8c2f140e3e59978e0818b63f15\": container with ID starting with 9eedcd188cf7fbfd218599224168819f1b149a8c2f140e3e59978e0818b63f15 not found: ID does 
not exist" containerID="9eedcd188cf7fbfd218599224168819f1b149a8c2f140e3e59978e0818b63f15" Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.761967 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eedcd188cf7fbfd218599224168819f1b149a8c2f140e3e59978e0818b63f15"} err="failed to get container status \"9eedcd188cf7fbfd218599224168819f1b149a8c2f140e3e59978e0818b63f15\": rpc error: code = NotFound desc = could not find container \"9eedcd188cf7fbfd218599224168819f1b149a8c2f140e3e59978e0818b63f15\": container with ID starting with 9eedcd188cf7fbfd218599224168819f1b149a8c2f140e3e59978e0818b63f15 not found: ID does not exist" Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.811374 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa23946-d44e-43ca-934b-521bd91a01ff-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.811679 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa23946-d44e-43ca-934b-521bd91a01ff-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.811692 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2clm\" (UniqueName: \"kubernetes.io/projected/0fa23946-d44e-43ca-934b-521bd91a01ff-kube-api-access-l2clm\") on node \"crc\" DevicePath \"\"" Sep 29 11:29:08 crc kubenswrapper[4752]: I0929 11:29:08.995192 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f957l"] Sep 29 11:29:09 crc kubenswrapper[4752]: I0929 11:29:09.014664 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f957l"] Sep 29 11:29:09 crc kubenswrapper[4752]: I0929 11:29:09.410724 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:29:09 crc kubenswrapper[4752]: I0929 11:29:09.410771 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:29:09 crc kubenswrapper[4752]: I0929 11:29:09.411323 4752 scope.go:117] "RemoveContainer" containerID="90950e56d39442e337bb3cdfe301d5f724ec54f47abd605b87f56e9741a68f4f" Sep 29 11:29:09 crc kubenswrapper[4752]: E0929 11:29:09.411519 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:29:10 crc kubenswrapper[4752]: I0929 11:29:10.040336 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fa23946-d44e-43ca-934b-521bd91a01ff" path="/var/lib/kubelet/pods/0fa23946-d44e-43ca-934b-521bd91a01ff/volumes" Sep 29 11:29:20 crc kubenswrapper[4752]: I0929 11:29:20.035267 4752 scope.go:117] "RemoveContainer" containerID="90950e56d39442e337bb3cdfe301d5f724ec54f47abd605b87f56e9741a68f4f" Sep 29 11:29:20 crc kubenswrapper[4752]: E0929 11:29:20.036041 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:29:33 crc kubenswrapper[4752]: I0929 11:29:33.032038 4752 scope.go:117] "RemoveContainer" 
containerID="90950e56d39442e337bb3cdfe301d5f724ec54f47abd605b87f56e9741a68f4f" Sep 29 11:29:33 crc kubenswrapper[4752]: E0929 11:29:33.032837 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:29:46 crc kubenswrapper[4752]: I0929 11:29:46.031477 4752 scope.go:117] "RemoveContainer" containerID="90950e56d39442e337bb3cdfe301d5f724ec54f47abd605b87f56e9741a68f4f" Sep 29 11:29:46 crc kubenswrapper[4752]: E0929 11:29:46.032338 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:29:57 crc kubenswrapper[4752]: I0929 11:29:57.031569 4752 scope.go:117] "RemoveContainer" containerID="90950e56d39442e337bb3cdfe301d5f724ec54f47abd605b87f56e9741a68f4f" Sep 29 11:29:57 crc kubenswrapper[4752]: E0929 11:29:57.032464 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:30:00 crc kubenswrapper[4752]: I0929 11:30:00.159054 4752 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29319090-vrv97"] Sep 29 11:30:00 crc kubenswrapper[4752]: E0929 11:30:00.159970 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa23946-d44e-43ca-934b-521bd91a01ff" containerName="extract-utilities" Sep 29 11:30:00 crc kubenswrapper[4752]: I0929 11:30:00.159988 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa23946-d44e-43ca-934b-521bd91a01ff" containerName="extract-utilities" Sep 29 11:30:00 crc kubenswrapper[4752]: E0929 11:30:00.160011 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa23946-d44e-43ca-934b-521bd91a01ff" containerName="extract-content" Sep 29 11:30:00 crc kubenswrapper[4752]: I0929 11:30:00.160021 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa23946-d44e-43ca-934b-521bd91a01ff" containerName="extract-content" Sep 29 11:30:00 crc kubenswrapper[4752]: E0929 11:30:00.160057 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa23946-d44e-43ca-934b-521bd91a01ff" containerName="registry-server" Sep 29 11:30:00 crc kubenswrapper[4752]: I0929 11:30:00.160065 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa23946-d44e-43ca-934b-521bd91a01ff" containerName="registry-server" Sep 29 11:30:00 crc kubenswrapper[4752]: I0929 11:30:00.160235 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fa23946-d44e-43ca-934b-521bd91a01ff" containerName="registry-server" Sep 29 11:30:00 crc kubenswrapper[4752]: I0929 11:30:00.160768 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319090-vrv97" Sep 29 11:30:00 crc kubenswrapper[4752]: I0929 11:30:00.162888 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 29 11:30:00 crc kubenswrapper[4752]: I0929 11:30:00.164527 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 29 11:30:00 crc kubenswrapper[4752]: I0929 11:30:00.170301 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319090-vrv97"] Sep 29 11:30:00 crc kubenswrapper[4752]: I0929 11:30:00.322390 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/848fc867-8805-4835-a8e6-b534782360ad-secret-volume\") pod \"collect-profiles-29319090-vrv97\" (UID: \"848fc867-8805-4835-a8e6-b534782360ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319090-vrv97" Sep 29 11:30:00 crc kubenswrapper[4752]: I0929 11:30:00.322489 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-476hz\" (UniqueName: \"kubernetes.io/projected/848fc867-8805-4835-a8e6-b534782360ad-kube-api-access-476hz\") pod \"collect-profiles-29319090-vrv97\" (UID: \"848fc867-8805-4835-a8e6-b534782360ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319090-vrv97" Sep 29 11:30:00 crc kubenswrapper[4752]: I0929 11:30:00.322595 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/848fc867-8805-4835-a8e6-b534782360ad-config-volume\") pod \"collect-profiles-29319090-vrv97\" (UID: \"848fc867-8805-4835-a8e6-b534782360ad\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29319090-vrv97" Sep 29 11:30:00 crc kubenswrapper[4752]: I0929 11:30:00.423612 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/848fc867-8805-4835-a8e6-b534782360ad-secret-volume\") pod \"collect-profiles-29319090-vrv97\" (UID: \"848fc867-8805-4835-a8e6-b534782360ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319090-vrv97" Sep 29 11:30:00 crc kubenswrapper[4752]: I0929 11:30:00.423669 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-476hz\" (UniqueName: \"kubernetes.io/projected/848fc867-8805-4835-a8e6-b534782360ad-kube-api-access-476hz\") pod \"collect-profiles-29319090-vrv97\" (UID: \"848fc867-8805-4835-a8e6-b534782360ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319090-vrv97" Sep 29 11:30:00 crc kubenswrapper[4752]: I0929 11:30:00.423741 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/848fc867-8805-4835-a8e6-b534782360ad-config-volume\") pod \"collect-profiles-29319090-vrv97\" (UID: \"848fc867-8805-4835-a8e6-b534782360ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319090-vrv97" Sep 29 11:30:00 crc kubenswrapper[4752]: I0929 11:30:00.424635 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/848fc867-8805-4835-a8e6-b534782360ad-config-volume\") pod \"collect-profiles-29319090-vrv97\" (UID: \"848fc867-8805-4835-a8e6-b534782360ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319090-vrv97" Sep 29 11:30:00 crc kubenswrapper[4752]: I0929 11:30:00.431112 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/848fc867-8805-4835-a8e6-b534782360ad-secret-volume\") pod \"collect-profiles-29319090-vrv97\" (UID: \"848fc867-8805-4835-a8e6-b534782360ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319090-vrv97" Sep 29 11:30:00 crc kubenswrapper[4752]: I0929 11:30:00.443233 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-476hz\" (UniqueName: \"kubernetes.io/projected/848fc867-8805-4835-a8e6-b534782360ad-kube-api-access-476hz\") pod \"collect-profiles-29319090-vrv97\" (UID: \"848fc867-8805-4835-a8e6-b534782360ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319090-vrv97" Sep 29 11:30:00 crc kubenswrapper[4752]: I0929 11:30:00.485690 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319090-vrv97" Sep 29 11:30:00 crc kubenswrapper[4752]: I0929 11:30:00.910039 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319090-vrv97"] Sep 29 11:30:01 crc kubenswrapper[4752]: I0929 11:30:01.136430 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319090-vrv97" event={"ID":"848fc867-8805-4835-a8e6-b534782360ad","Type":"ContainerStarted","Data":"3128a1b3464e1f77baf0f89e1f857ad10542a4c6475af2221047fce02235c244"} Sep 29 11:30:02 crc kubenswrapper[4752]: I0929 11:30:02.144978 4752 generic.go:334] "Generic (PLEG): container finished" podID="848fc867-8805-4835-a8e6-b534782360ad" containerID="02214920cd003a0a0bb2e57b1824269aefd569f16bb9925091c0278c487a9186" exitCode=0 Sep 29 11:30:02 crc kubenswrapper[4752]: I0929 11:30:02.145030 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319090-vrv97" 
event={"ID":"848fc867-8805-4835-a8e6-b534782360ad","Type":"ContainerDied","Data":"02214920cd003a0a0bb2e57b1824269aefd569f16bb9925091c0278c487a9186"} Sep 29 11:30:03 crc kubenswrapper[4752]: I0929 11:30:03.436736 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319090-vrv97" Sep 29 11:30:03 crc kubenswrapper[4752]: I0929 11:30:03.576542 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/848fc867-8805-4835-a8e6-b534782360ad-config-volume\") pod \"848fc867-8805-4835-a8e6-b534782360ad\" (UID: \"848fc867-8805-4835-a8e6-b534782360ad\") " Sep 29 11:30:03 crc kubenswrapper[4752]: I0929 11:30:03.576674 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-476hz\" (UniqueName: \"kubernetes.io/projected/848fc867-8805-4835-a8e6-b534782360ad-kube-api-access-476hz\") pod \"848fc867-8805-4835-a8e6-b534782360ad\" (UID: \"848fc867-8805-4835-a8e6-b534782360ad\") " Sep 29 11:30:03 crc kubenswrapper[4752]: I0929 11:30:03.576745 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/848fc867-8805-4835-a8e6-b534782360ad-secret-volume\") pod \"848fc867-8805-4835-a8e6-b534782360ad\" (UID: \"848fc867-8805-4835-a8e6-b534782360ad\") " Sep 29 11:30:03 crc kubenswrapper[4752]: I0929 11:30:03.578070 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/848fc867-8805-4835-a8e6-b534782360ad-config-volume" (OuterVolumeSpecName: "config-volume") pod "848fc867-8805-4835-a8e6-b534782360ad" (UID: "848fc867-8805-4835-a8e6-b534782360ad"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 11:30:03 crc kubenswrapper[4752]: I0929 11:30:03.581880 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/848fc867-8805-4835-a8e6-b534782360ad-kube-api-access-476hz" (OuterVolumeSpecName: "kube-api-access-476hz") pod "848fc867-8805-4835-a8e6-b534782360ad" (UID: "848fc867-8805-4835-a8e6-b534782360ad"). InnerVolumeSpecName "kube-api-access-476hz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:30:03 crc kubenswrapper[4752]: I0929 11:30:03.582727 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848fc867-8805-4835-a8e6-b534782360ad-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "848fc867-8805-4835-a8e6-b534782360ad" (UID: "848fc867-8805-4835-a8e6-b534782360ad"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:30:03 crc kubenswrapper[4752]: I0929 11:30:03.677940 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-476hz\" (UniqueName: \"kubernetes.io/projected/848fc867-8805-4835-a8e6-b534782360ad-kube-api-access-476hz\") on node \"crc\" DevicePath \"\"" Sep 29 11:30:03 crc kubenswrapper[4752]: I0929 11:30:03.677972 4752 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/848fc867-8805-4835-a8e6-b534782360ad-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 29 11:30:03 crc kubenswrapper[4752]: I0929 11:30:03.677984 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/848fc867-8805-4835-a8e6-b534782360ad-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 11:30:04 crc kubenswrapper[4752]: I0929 11:30:04.161084 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319090-vrv97" 
event={"ID":"848fc867-8805-4835-a8e6-b534782360ad","Type":"ContainerDied","Data":"3128a1b3464e1f77baf0f89e1f857ad10542a4c6475af2221047fce02235c244"} Sep 29 11:30:04 crc kubenswrapper[4752]: I0929 11:30:04.161430 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3128a1b3464e1f77baf0f89e1f857ad10542a4c6475af2221047fce02235c244" Sep 29 11:30:04 crc kubenswrapper[4752]: I0929 11:30:04.161160 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319090-vrv97" Sep 29 11:30:04 crc kubenswrapper[4752]: I0929 11:30:04.511791 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28"] Sep 29 11:30:04 crc kubenswrapper[4752]: I0929 11:30:04.526325 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319045-qpr28"] Sep 29 11:30:06 crc kubenswrapper[4752]: I0929 11:30:06.043129 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae8c092c-ec9d-456a-9ba3-5501c22f6280" path="/var/lib/kubelet/pods/ae8c092c-ec9d-456a-9ba3-5501c22f6280/volumes" Sep 29 11:30:10 crc kubenswrapper[4752]: I0929 11:30:10.062852 4752 scope.go:117] "RemoveContainer" containerID="90950e56d39442e337bb3cdfe301d5f724ec54f47abd605b87f56e9741a68f4f" Sep 29 11:30:10 crc kubenswrapper[4752]: E0929 11:30:10.063402 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f840aee8-6059-4658-89f6-09d799f64614)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f840aee8-6059-4658-89f6-09d799f64614" Sep 29 11:30:13 crc kubenswrapper[4752]: I0929 11:30:13.809930 4752 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-drxw2"]
Sep 29 11:30:13 crc kubenswrapper[4752]: I0929 11:30:13.815839 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-drxw2"]
Sep 29 11:30:13 crc kubenswrapper[4752]: I0929 11:30:13.883418 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher60ae-account-delete-sh7wc"]
Sep 29 11:30:13 crc kubenswrapper[4752]: E0929 11:30:13.883839 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848fc867-8805-4835-a8e6-b534782360ad" containerName="collect-profiles"
Sep 29 11:30:13 crc kubenswrapper[4752]: I0929 11:30:13.883858 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="848fc867-8805-4835-a8e6-b534782360ad" containerName="collect-profiles"
Sep 29 11:30:13 crc kubenswrapper[4752]: I0929 11:30:13.884074 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="848fc867-8805-4835-a8e6-b534782360ad" containerName="collect-profiles"
Sep 29 11:30:13 crc kubenswrapper[4752]: I0929 11:30:13.884629 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher60ae-account-delete-sh7wc"
Sep 29 11:30:13 crc kubenswrapper[4752]: I0929 11:30:13.927905 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Sep 29 11:30:13 crc kubenswrapper[4752]: I0929 11:30:13.985866 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher60ae-account-delete-sh7wc"]
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.010764 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.011058 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="514664cb-8dd6-4485-bd72-85346d81346c" containerName="watcher-kuttl-api-log" containerID="cri-o://882177c1010a738182504c8ff4965c48e3c8b9f425c2477b2a4b427b37c4951e" gracePeriod=30
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.011512 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="514664cb-8dd6-4485-bd72-85346d81346c" containerName="watcher-api" containerID="cri-o://c2aeb3e958ca36cc87702187b7dbbe6665610fbcc4dd0fab6df4723e465fe7cc" gracePeriod=30
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.021312 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"]
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.021628 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="95d7502d-c8b2-4023-a56f-f69bf1dc4b0f" containerName="watcher-kuttl-api-log" containerID="cri-o://94070899e8acd4d5e8e47c4c0add85ef44035fb6afb55608e67393ef4d7d1331" gracePeriod=30
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.021737 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="95d7502d-c8b2-4023-a56f-f69bf1dc4b0f" containerName="watcher-api" containerID="cri-o://c678f0d8a2213be62656dc64a578177c4c5f70f2a4ba2376857466766218c920" gracePeriod=30
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.074262 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e" path="/var/lib/kubelet/pods/107ba64b-b986-4e7a-bfe3-e9f6b8a68e0e/volumes"
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.075144 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-97jzl"]
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.075267 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-97jzl"]
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.077022 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-60ae-account-create-5djbr"]
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.090049 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bf99\" (UniqueName: \"kubernetes.io/projected/e17b7ff7-14b1-4e5b-9293-1a59096c6acc-kube-api-access-5bf99\") pod \"watcher60ae-account-delete-sh7wc\" (UID: \"e17b7ff7-14b1-4e5b-9293-1a59096c6acc\") " pod="watcher-kuttl-default/watcher60ae-account-delete-sh7wc"
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.099720 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-60ae-account-create-5djbr"]
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.106943 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher60ae-account-delete-sh7wc"]
Sep 29 11:30:14 crc kubenswrapper[4752]: E0929 11:30:14.107608 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-5bf99], unattached volumes=[], failed to process volumes=[]: context canceled" pod="watcher-kuttl-default/watcher60ae-account-delete-sh7wc" podUID="e17b7ff7-14b1-4e5b-9293-1a59096c6acc"
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.118970 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.119216 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="a72ff6e9-9f96-4da4-979c-7ef8afdc59c7" containerName="watcher-applier" containerID="cri-o://86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" gracePeriod=30
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.191713 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bf99\" (UniqueName: \"kubernetes.io/projected/e17b7ff7-14b1-4e5b-9293-1a59096c6acc-kube-api-access-5bf99\") pod \"watcher60ae-account-delete-sh7wc\" (UID: \"e17b7ff7-14b1-4e5b-9293-1a59096c6acc\") " pod="watcher-kuttl-default/watcher60ae-account-delete-sh7wc"
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.217113 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bf99\" (UniqueName: \"kubernetes.io/projected/e17b7ff7-14b1-4e5b-9293-1a59096c6acc-kube-api-access-5bf99\") pod \"watcher60ae-account-delete-sh7wc\" (UID: \"e17b7ff7-14b1-4e5b-9293-1a59096c6acc\") " pod="watcher-kuttl-default/watcher60ae-account-delete-sh7wc"
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.278962 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-84dqj"]
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.282612 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-84dqj"
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.285223 4752 generic.go:334] "Generic (PLEG): container finished" podID="95d7502d-c8b2-4023-a56f-f69bf1dc4b0f" containerID="94070899e8acd4d5e8e47c4c0add85ef44035fb6afb55608e67393ef4d7d1331" exitCode=143
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.285325 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f","Type":"ContainerDied","Data":"94070899e8acd4d5e8e47c4c0add85ef44035fb6afb55608e67393ef4d7d1331"}
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.293310 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-84dqj"]
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.311276 4752 generic.go:334] "Generic (PLEG): container finished" podID="514664cb-8dd6-4485-bd72-85346d81346c" containerID="882177c1010a738182504c8ff4965c48e3c8b9f425c2477b2a4b427b37c4951e" exitCode=143
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.311384 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher60ae-account-delete-sh7wc"
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.311877 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"514664cb-8dd6-4485-bd72-85346d81346c","Type":"ContainerDied","Data":"882177c1010a738182504c8ff4965c48e3c8b9f425c2477b2a4b427b37c4951e"}
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.364405 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.372831 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher60ae-account-delete-sh7wc"
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.396061 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsj5r\" (UniqueName: \"kubernetes.io/projected/f840aee8-6059-4658-89f6-09d799f64614-kube-api-access-qsj5r\") pod \"f840aee8-6059-4658-89f6-09d799f64614\" (UID: \"f840aee8-6059-4658-89f6-09d799f64614\") "
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.396143 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f840aee8-6059-4658-89f6-09d799f64614-logs\") pod \"f840aee8-6059-4658-89f6-09d799f64614\" (UID: \"f840aee8-6059-4658-89f6-09d799f64614\") "
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.396237 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f840aee8-6059-4658-89f6-09d799f64614-custom-prometheus-ca\") pod \"f840aee8-6059-4658-89f6-09d799f64614\" (UID: \"f840aee8-6059-4658-89f6-09d799f64614\") "
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.396324 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bf99\" (UniqueName: \"kubernetes.io/projected/e17b7ff7-14b1-4e5b-9293-1a59096c6acc-kube-api-access-5bf99\") pod \"e17b7ff7-14b1-4e5b-9293-1a59096c6acc\" (UID: \"e17b7ff7-14b1-4e5b-9293-1a59096c6acc\") "
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.396391 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f840aee8-6059-4658-89f6-09d799f64614-config-data\") pod \"f840aee8-6059-4658-89f6-09d799f64614\" (UID: \"f840aee8-6059-4658-89f6-09d799f64614\") "
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.396672 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rcxg\" (UniqueName: \"kubernetes.io/projected/9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd-kube-api-access-7rcxg\") pod \"watcher-db-create-84dqj\" (UID: \"9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd\") " pod="watcher-kuttl-default/watcher-db-create-84dqj"
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.397133 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f840aee8-6059-4658-89f6-09d799f64614-logs" (OuterVolumeSpecName: "logs") pod "f840aee8-6059-4658-89f6-09d799f64614" (UID: "f840aee8-6059-4658-89f6-09d799f64614"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.422715 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f840aee8-6059-4658-89f6-09d799f64614-kube-api-access-qsj5r" (OuterVolumeSpecName: "kube-api-access-qsj5r") pod "f840aee8-6059-4658-89f6-09d799f64614" (UID: "f840aee8-6059-4658-89f6-09d799f64614"). InnerVolumeSpecName "kube-api-access-qsj5r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.431999 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e17b7ff7-14b1-4e5b-9293-1a59096c6acc-kube-api-access-5bf99" (OuterVolumeSpecName: "kube-api-access-5bf99") pod "e17b7ff7-14b1-4e5b-9293-1a59096c6acc" (UID: "e17b7ff7-14b1-4e5b-9293-1a59096c6acc"). InnerVolumeSpecName "kube-api-access-5bf99". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 11:30:14 crc kubenswrapper[4752]: E0929 11:30:14.461579 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.463950 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f840aee8-6059-4658-89f6-09d799f64614-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "f840aee8-6059-4658-89f6-09d799f64614" (UID: "f840aee8-6059-4658-89f6-09d799f64614"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:30:14 crc kubenswrapper[4752]: E0929 11:30:14.470265 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.470585 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f840aee8-6059-4658-89f6-09d799f64614-config-data" (OuterVolumeSpecName: "config-data") pod "f840aee8-6059-4658-89f6-09d799f64614" (UID: "f840aee8-6059-4658-89f6-09d799f64614"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:30:14 crc kubenswrapper[4752]: E0929 11:30:14.472118 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Sep 29 11:30:14 crc kubenswrapper[4752]: E0929 11:30:14.472170 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="a72ff6e9-9f96-4da4-979c-7ef8afdc59c7" containerName="watcher-applier"
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.497956 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rcxg\" (UniqueName: \"kubernetes.io/projected/9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd-kube-api-access-7rcxg\") pod \"watcher-db-create-84dqj\" (UID: \"9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd\") " pod="watcher-kuttl-default/watcher-db-create-84dqj"
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.498359 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsj5r\" (UniqueName: \"kubernetes.io/projected/f840aee8-6059-4658-89f6-09d799f64614-kube-api-access-qsj5r\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.498563 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f840aee8-6059-4658-89f6-09d799f64614-logs\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.498585 4752 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f840aee8-6059-4658-89f6-09d799f64614-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.498598 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bf99\" (UniqueName: \"kubernetes.io/projected/e17b7ff7-14b1-4e5b-9293-1a59096c6acc-kube-api-access-5bf99\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.499959 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f840aee8-6059-4658-89f6-09d799f64614-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.515727 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rcxg\" (UniqueName: \"kubernetes.io/projected/9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd-kube-api-access-7rcxg\") pod \"watcher-db-create-84dqj\" (UID: \"9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd\") " pod="watcher-kuttl-default/watcher-db-create-84dqj"
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.668859 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-84dqj"
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.894958 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="95d7502d-c8b2-4023-a56f-f69bf1dc4b0f" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.182:9322/\": read tcp 10.217.0.2:54276->10.217.0.182:9322: read: connection reset by peer"
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.898758 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="95d7502d-c8b2-4023-a56f-f69bf1dc4b0f" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.182:9322/\": read tcp 10.217.0.2:54264->10.217.0.182:9322: read: connection reset by peer"
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.945360 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="514664cb-8dd6-4485-bd72-85346d81346c" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.180:9322/\": read tcp 10.217.0.2:44436->10.217.0.180:9322: read: connection reset by peer"
Sep 29 11:30:14 crc kubenswrapper[4752]: I0929 11:30:14.945708 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="514664cb-8dd6-4485-bd72-85346d81346c" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.180:9322/\": read tcp 10.217.0.2:44422->10.217.0.180:9322: read: connection reset by peer"
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.164387 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-84dqj"]
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.329828 4752 generic.go:334] "Generic (PLEG): container finished" podID="95d7502d-c8b2-4023-a56f-f69bf1dc4b0f" containerID="c678f0d8a2213be62656dc64a578177c4c5f70f2a4ba2376857466766218c920" exitCode=0
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.329886 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f","Type":"ContainerDied","Data":"c678f0d8a2213be62656dc64a578177c4c5f70f2a4ba2376857466766218c920"}
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.329910 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f","Type":"ContainerDied","Data":"acd23cdd52edd2f93622223d2187bfed18cb9f4d12b581a73d267dfbcaf64489"}
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.329919 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acd23cdd52edd2f93622223d2187bfed18cb9f4d12b581a73d267dfbcaf64489"
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.341240 4752 generic.go:334] "Generic (PLEG): container finished" podID="514664cb-8dd6-4485-bd72-85346d81346c" containerID="c2aeb3e958ca36cc87702187b7dbbe6665610fbcc4dd0fab6df4723e465fe7cc" exitCode=0
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.341322 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"514664cb-8dd6-4485-bd72-85346d81346c","Type":"ContainerDied","Data":"c2aeb3e958ca36cc87702187b7dbbe6665610fbcc4dd0fab6df4723e465fe7cc"}
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.349467 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-84dqj" event={"ID":"9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd","Type":"ContainerStarted","Data":"41e4b26366afa74cca1b34cdba0b2103d95636d9707238b862c2f44b544a0374"}
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.356297 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher60ae-account-delete-sh7wc"
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.356750 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.361578 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f840aee8-6059-4658-89f6-09d799f64614","Type":"ContainerDied","Data":"a373b497c645ed88afd0f91bef6dcacb4cce9578ff28cd821f11a51e53b72922"}
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.361644 4752 scope.go:117] "RemoveContainer" containerID="90950e56d39442e337bb3cdfe301d5f724ec54f47abd605b87f56e9741a68f4f"
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.464642 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1"
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.475236 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.477642 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.495432 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.513014 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher60ae-account-delete-sh7wc"]
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.515690 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/514664cb-8dd6-4485-bd72-85346d81346c-config-data\") pod \"514664cb-8dd6-4485-bd72-85346d81346c\" (UID: \"514664cb-8dd6-4485-bd72-85346d81346c\") "
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.515760 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/514664cb-8dd6-4485-bd72-85346d81346c-logs\") pod \"514664cb-8dd6-4485-bd72-85346d81346c\" (UID: \"514664cb-8dd6-4485-bd72-85346d81346c\") "
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.515818 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wpg9\" (UniqueName: \"kubernetes.io/projected/514664cb-8dd6-4485-bd72-85346d81346c-kube-api-access-9wpg9\") pod \"514664cb-8dd6-4485-bd72-85346d81346c\" (UID: \"514664cb-8dd6-4485-bd72-85346d81346c\") "
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.515849 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-custom-prometheus-ca\") pod \"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f\" (UID: \"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f\") "
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.515875 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-logs\") pod \"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f\" (UID: \"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f\") "
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.515908 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzmv7\" (UniqueName: \"kubernetes.io/projected/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-kube-api-access-lzmv7\") pod \"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f\" (UID: \"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f\") "
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.515943 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-config-data\") pod \"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f\" (UID: \"95d7502d-c8b2-4023-a56f-f69bf1dc4b0f\") "
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.515980 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/514664cb-8dd6-4485-bd72-85346d81346c-custom-prometheus-ca\") pod \"514664cb-8dd6-4485-bd72-85346d81346c\" (UID: \"514664cb-8dd6-4485-bd72-85346d81346c\") "
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.528879 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher60ae-account-delete-sh7wc"]
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.529755 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-logs" (OuterVolumeSpecName: "logs") pod "95d7502d-c8b2-4023-a56f-f69bf1dc4b0f" (UID: "95d7502d-c8b2-4023-a56f-f69bf1dc4b0f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.529841 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/514664cb-8dd6-4485-bd72-85346d81346c-logs" (OuterVolumeSpecName: "logs") pod "514664cb-8dd6-4485-bd72-85346d81346c" (UID: "514664cb-8dd6-4485-bd72-85346d81346c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.537125 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-kube-api-access-lzmv7" (OuterVolumeSpecName: "kube-api-access-lzmv7") pod "95d7502d-c8b2-4023-a56f-f69bf1dc4b0f" (UID: "95d7502d-c8b2-4023-a56f-f69bf1dc4b0f"). InnerVolumeSpecName "kube-api-access-lzmv7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.546348 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/514664cb-8dd6-4485-bd72-85346d81346c-kube-api-access-9wpg9" (OuterVolumeSpecName: "kube-api-access-9wpg9") pod "514664cb-8dd6-4485-bd72-85346d81346c" (UID: "514664cb-8dd6-4485-bd72-85346d81346c"). InnerVolumeSpecName "kube-api-access-9wpg9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.578724 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/514664cb-8dd6-4485-bd72-85346d81346c-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "514664cb-8dd6-4485-bd72-85346d81346c" (UID: "514664cb-8dd6-4485-bd72-85346d81346c"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.601362 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "95d7502d-c8b2-4023-a56f-f69bf1dc4b0f" (UID: "95d7502d-c8b2-4023-a56f-f69bf1dc4b0f"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.601711 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/514664cb-8dd6-4485-bd72-85346d81346c-config-data" (OuterVolumeSpecName: "config-data") pod "514664cb-8dd6-4485-bd72-85346d81346c" (UID: "514664cb-8dd6-4485-bd72-85346d81346c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.617793 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-logs\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.618289 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzmv7\" (UniqueName: \"kubernetes.io/projected/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-kube-api-access-lzmv7\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.618379 4752 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/514664cb-8dd6-4485-bd72-85346d81346c-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.618490 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/514664cb-8dd6-4485-bd72-85346d81346c-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.618558 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/514664cb-8dd6-4485-bd72-85346d81346c-logs\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.618626 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wpg9\" (UniqueName: \"kubernetes.io/projected/514664cb-8dd6-4485-bd72-85346d81346c-kube-api-access-9wpg9\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.618733 4752 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.629737 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-config-data" (OuterVolumeSpecName: "config-data") pod "95d7502d-c8b2-4023-a56f-f69bf1dc4b0f" (UID: "95d7502d-c8b2-4023-a56f-f69bf1dc4b0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:30:15 crc kubenswrapper[4752]: I0929 11:30:15.719552 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:16 crc kubenswrapper[4752]: I0929 11:30:16.043481 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45988c51-6072-40ba-91cf-dce53d796c66" path="/var/lib/kubelet/pods/45988c51-6072-40ba-91cf-dce53d796c66/volumes"
Sep 29 11:30:16 crc kubenswrapper[4752]: I0929 11:30:16.044823 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97c3c1de-9b22-442b-96b0-1df0d483fa12" path="/var/lib/kubelet/pods/97c3c1de-9b22-442b-96b0-1df0d483fa12/volumes"
Sep 29 11:30:16 crc kubenswrapper[4752]: I0929 11:30:16.045421 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e17b7ff7-14b1-4e5b-9293-1a59096c6acc" path="/var/lib/kubelet/pods/e17b7ff7-14b1-4e5b-9293-1a59096c6acc/volumes"
Sep 29 11:30:16 crc kubenswrapper[4752]: I0929 11:30:16.045930 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f840aee8-6059-4658-89f6-09d799f64614" path="/var/lib/kubelet/pods/f840aee8-6059-4658-89f6-09d799f64614/volumes"
Sep 29 11:30:16 crc kubenswrapper[4752]: I0929 11:30:16.378487 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"514664cb-8dd6-4485-bd72-85346d81346c","Type":"ContainerDied","Data":"37793d51589d92bedc26ed575f05659b3b303681d6ce881ab98bf76820a7c3f5"}
Sep 29 11:30:16 crc kubenswrapper[4752]: I0929 11:30:16.378548 4752 scope.go:117] "RemoveContainer" containerID="c2aeb3e958ca36cc87702187b7dbbe6665610fbcc4dd0fab6df4723e465fe7cc"
Sep 29 11:30:16 crc kubenswrapper[4752]: I0929 11:30:16.378739 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:16 crc kubenswrapper[4752]: I0929 11:30:16.381932 4752 generic.go:334] "Generic (PLEG): container finished" podID="9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd" containerID="ec28be1e5f634f90da981f9c5e07d88e4d408e07e2938fcce564217941a2ceee" exitCode=0
Sep 29 11:30:16 crc kubenswrapper[4752]: I0929 11:30:16.381984 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-84dqj" event={"ID":"9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd","Type":"ContainerDied","Data":"ec28be1e5f634f90da981f9c5e07d88e4d408e07e2938fcce564217941a2ceee"}
Sep 29 11:30:16 crc kubenswrapper[4752]: I0929 11:30:16.384512 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1"
Sep 29 11:30:16 crc kubenswrapper[4752]: I0929 11:30:16.404658 4752 scope.go:117] "RemoveContainer" containerID="882177c1010a738182504c8ff4965c48e3c8b9f425c2477b2a4b427b37c4951e"
Sep 29 11:30:16 crc kubenswrapper[4752]: I0929 11:30:16.420846 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Sep 29 11:30:16 crc kubenswrapper[4752]: I0929 11:30:16.424858 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Sep 29 11:30:16 crc kubenswrapper[4752]: I0929 11:30:16.430371 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"]
Sep 29 11:30:16 crc kubenswrapper[4752]: I0929 11:30:16.435692 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"]
Sep 29 11:30:16 crc kubenswrapper[4752]: I0929 11:30:16.903498 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Sep 29 11:30:16 crc kubenswrapper[4752]: I0929 11:30:16.903969 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="70c139f9-abc5-4375-a70f-768959d3945e" containerName="ceilometer-central-agent" containerID="cri-o://92ba532bef5e98659babc1ec73055f9b66e157b9db2ceb6db7057163e037d15a" gracePeriod=30
Sep 29 11:30:16 crc kubenswrapper[4752]: I0929 11:30:16.904439 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="70c139f9-abc5-4375-a70f-768959d3945e" containerName="proxy-httpd" containerID="cri-o://f29594268803a043a1620f6e0479c1da3affafabd1524e87c5cd3d694969ded8" gracePeriod=30
Sep 29 11:30:16 crc kubenswrapper[4752]: I0929 11:30:16.904555 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="70c139f9-abc5-4375-a70f-768959d3945e" containerName="ceilometer-notification-agent" containerID="cri-o://7e94e0121f7521fb5e12d0c79f85472c2a3a837e529ded4449b1778d4863e3d5" gracePeriod=30
Sep 29 11:30:16 crc kubenswrapper[4752]: I0929 11:30:16.904719 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="70c139f9-abc5-4375-a70f-768959d3945e" containerName="sg-core" containerID="cri-o://2de9987615f58684a718a1384a0714af8a8938e628d0a1a3962a6135810349b1" gracePeriod=30
Sep 29 11:30:17 crc kubenswrapper[4752]: I0929 11:30:17.395873 4752 generic.go:334] "Generic (PLEG): container finished" podID="70c139f9-abc5-4375-a70f-768959d3945e" containerID="f29594268803a043a1620f6e0479c1da3affafabd1524e87c5cd3d694969ded8" exitCode=0
Sep 29 11:30:17 crc kubenswrapper[4752]: I0929 11:30:17.395903 4752 generic.go:334] "Generic (PLEG): container finished" podID="70c139f9-abc5-4375-a70f-768959d3945e" containerID="2de9987615f58684a718a1384a0714af8a8938e628d0a1a3962a6135810349b1" exitCode=2
Sep 29 11:30:17 crc kubenswrapper[4752]: I0929 11:30:17.395910 4752 generic.go:334] "Generic (PLEG): container finished" podID="70c139f9-abc5-4375-a70f-768959d3945e" containerID="92ba532bef5e98659babc1ec73055f9b66e157b9db2ceb6db7057163e037d15a" exitCode=0
Sep 29 11:30:17 crc kubenswrapper[4752]: I0929 11:30:17.395917 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"70c139f9-abc5-4375-a70f-768959d3945e","Type":"ContainerDied","Data":"f29594268803a043a1620f6e0479c1da3affafabd1524e87c5cd3d694969ded8"}
Sep 29 11:30:17 crc kubenswrapper[4752]: I0929 11:30:17.395954 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"70c139f9-abc5-4375-a70f-768959d3945e","Type":"ContainerDied","Data":"2de9987615f58684a718a1384a0714af8a8938e628d0a1a3962a6135810349b1"}
Sep 29 11:30:17 crc kubenswrapper[4752]: I0929 11:30:17.395989 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"70c139f9-abc5-4375-a70f-768959d3945e","Type":"ContainerDied","Data":"92ba532bef5e98659babc1ec73055f9b66e157b9db2ceb6db7057163e037d15a"}
Sep 29 11:30:17 crc kubenswrapper[4752]: I0929 11:30:17.696997 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-84dqj"
Sep 29 11:30:17 crc kubenswrapper[4752]: I0929 11:30:17.762813 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rcxg\" (UniqueName: \"kubernetes.io/projected/9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd-kube-api-access-7rcxg\") pod \"9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd\" (UID: \"9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd\") "
Sep 29 11:30:17 crc kubenswrapper[4752]: I0929 11:30:17.771552 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd-kube-api-access-7rcxg" (OuterVolumeSpecName: "kube-api-access-7rcxg") pod "9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd" (UID: "9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd"). InnerVolumeSpecName "kube-api-access-7rcxg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 11:30:17 crc kubenswrapper[4752]: I0929 11:30:17.864447 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rcxg\" (UniqueName: \"kubernetes.io/projected/9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd-kube-api-access-7rcxg\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:18 crc kubenswrapper[4752]: I0929 11:30:18.040111 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="514664cb-8dd6-4485-bd72-85346d81346c" path="/var/lib/kubelet/pods/514664cb-8dd6-4485-bd72-85346d81346c/volumes"
Sep 29 11:30:18 crc kubenswrapper[4752]: I0929 11:30:18.041075 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95d7502d-c8b2-4023-a56f-f69bf1dc4b0f" path="/var/lib/kubelet/pods/95d7502d-c8b2-4023-a56f-f69bf1dc4b0f/volumes"
Sep 29 11:30:18 crc kubenswrapper[4752]: I0929 11:30:18.408660 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-84dqj" event={"ID":"9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd","Type":"ContainerDied","Data":"41e4b26366afa74cca1b34cdba0b2103d95636d9707238b862c2f44b544a0374"}
Sep 29 11:30:18 crc kubenswrapper[4752]: I0929 11:30:18.408701 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41e4b26366afa74cca1b34cdba0b2103d95636d9707238b862c2f44b544a0374"
Sep 29 11:30:18 crc kubenswrapper[4752]: I0929 11:30:18.408763 4752 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-84dqj" Sep 29 11:30:19 crc kubenswrapper[4752]: E0929 11:30:19.454812 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:30:19 crc kubenswrapper[4752]: E0929 11:30:19.456268 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:30:19 crc kubenswrapper[4752]: E0929 11:30:19.457992 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:30:19 crc kubenswrapper[4752]: E0929 11:30:19.458035 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="a72ff6e9-9f96-4da4-979c-7ef8afdc59c7" containerName="watcher-applier" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.002511 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.199750 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t42z2\" (UniqueName: \"kubernetes.io/projected/70c139f9-abc5-4375-a70f-768959d3945e-kube-api-access-t42z2\") pod \"70c139f9-abc5-4375-a70f-768959d3945e\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.200080 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-config-data\") pod \"70c139f9-abc5-4375-a70f-768959d3945e\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.200209 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-combined-ca-bundle\") pod \"70c139f9-abc5-4375-a70f-768959d3945e\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.200270 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70c139f9-abc5-4375-a70f-768959d3945e-log-httpd\") pod \"70c139f9-abc5-4375-a70f-768959d3945e\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.200310 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70c139f9-abc5-4375-a70f-768959d3945e-run-httpd\") pod \"70c139f9-abc5-4375-a70f-768959d3945e\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.200331 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-scripts\") pod \"70c139f9-abc5-4375-a70f-768959d3945e\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.200347 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-sg-core-conf-yaml\") pod \"70c139f9-abc5-4375-a70f-768959d3945e\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.200383 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-ceilometer-tls-certs\") pod \"70c139f9-abc5-4375-a70f-768959d3945e\" (UID: \"70c139f9-abc5-4375-a70f-768959d3945e\") " Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.200942 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70c139f9-abc5-4375-a70f-768959d3945e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "70c139f9-abc5-4375-a70f-768959d3945e" (UID: "70c139f9-abc5-4375-a70f-768959d3945e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.201062 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70c139f9-abc5-4375-a70f-768959d3945e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "70c139f9-abc5-4375-a70f-768959d3945e" (UID: "70c139f9-abc5-4375-a70f-768959d3945e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.201361 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70c139f9-abc5-4375-a70f-768959d3945e-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.201395 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70c139f9-abc5-4375-a70f-768959d3945e-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.206011 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-scripts" (OuterVolumeSpecName: "scripts") pod "70c139f9-abc5-4375-a70f-768959d3945e" (UID: "70c139f9-abc5-4375-a70f-768959d3945e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.206246 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70c139f9-abc5-4375-a70f-768959d3945e-kube-api-access-t42z2" (OuterVolumeSpecName: "kube-api-access-t42z2") pod "70c139f9-abc5-4375-a70f-768959d3945e" (UID: "70c139f9-abc5-4375-a70f-768959d3945e"). InnerVolumeSpecName "kube-api-access-t42z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.247386 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "70c139f9-abc5-4375-a70f-768959d3945e" (UID: "70c139f9-abc5-4375-a70f-768959d3945e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.257596 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "70c139f9-abc5-4375-a70f-768959d3945e" (UID: "70c139f9-abc5-4375-a70f-768959d3945e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.287780 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70c139f9-abc5-4375-a70f-768959d3945e" (UID: "70c139f9-abc5-4375-a70f-768959d3945e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.297096 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-config-data" (OuterVolumeSpecName: "config-data") pod "70c139f9-abc5-4375-a70f-768959d3945e" (UID: "70c139f9-abc5-4375-a70f-768959d3945e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.302799 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.302849 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.302863 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.302874 4752 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.302887 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t42z2\" (UniqueName: \"kubernetes.io/projected/70c139f9-abc5-4375-a70f-768959d3945e-kube-api-access-t42z2\") on node \"crc\" DevicePath \"\"" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.302900 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c139f9-abc5-4375-a70f-768959d3945e-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.425628 4752 generic.go:334] "Generic (PLEG): container finished" podID="70c139f9-abc5-4375-a70f-768959d3945e" containerID="7e94e0121f7521fb5e12d0c79f85472c2a3a837e529ded4449b1778d4863e3d5" exitCode=0 Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.425662 4752 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"70c139f9-abc5-4375-a70f-768959d3945e","Type":"ContainerDied","Data":"7e94e0121f7521fb5e12d0c79f85472c2a3a837e529ded4449b1778d4863e3d5"} Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.425687 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.425716 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"70c139f9-abc5-4375-a70f-768959d3945e","Type":"ContainerDied","Data":"0f907ad45ff7ea20881207bde8bdf7a243ec4928f39c8f04af02da0d01c5986b"} Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.425736 4752 scope.go:117] "RemoveContainer" containerID="f29594268803a043a1620f6e0479c1da3affafabd1524e87c5cd3d694969ded8" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.443228 4752 scope.go:117] "RemoveContainer" containerID="2de9987615f58684a718a1384a0714af8a8938e628d0a1a3962a6135810349b1" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.463836 4752 scope.go:117] "RemoveContainer" containerID="7e94e0121f7521fb5e12d0c79f85472c2a3a837e529ded4449b1778d4863e3d5" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.464920 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.472658 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.482447 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:30:20 crc kubenswrapper[4752]: E0929 11:30:20.482836 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c139f9-abc5-4375-a70f-768959d3945e" containerName="ceilometer-notification-agent" Sep 29 11:30:20 crc 
kubenswrapper[4752]: I0929 11:30:20.482855 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c139f9-abc5-4375-a70f-768959d3945e" containerName="ceilometer-notification-agent" Sep 29 11:30:20 crc kubenswrapper[4752]: E0929 11:30:20.482874 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514664cb-8dd6-4485-bd72-85346d81346c" containerName="watcher-api" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.482999 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="514664cb-8dd6-4485-bd72-85346d81346c" containerName="watcher-api" Sep 29 11:30:20 crc kubenswrapper[4752]: E0929 11:30:20.483014 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c139f9-abc5-4375-a70f-768959d3945e" containerName="ceilometer-central-agent" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483023 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c139f9-abc5-4375-a70f-768959d3945e" containerName="ceilometer-central-agent" Sep 29 11:30:20 crc kubenswrapper[4752]: E0929 11:30:20.483040 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c139f9-abc5-4375-a70f-768959d3945e" containerName="sg-core" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483048 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c139f9-abc5-4375-a70f-768959d3945e" containerName="sg-core" Sep 29 11:30:20 crc kubenswrapper[4752]: E0929 11:30:20.483058 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514664cb-8dd6-4485-bd72-85346d81346c" containerName="watcher-kuttl-api-log" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483065 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="514664cb-8dd6-4485-bd72-85346d81346c" containerName="watcher-kuttl-api-log" Sep 29 11:30:20 crc kubenswrapper[4752]: E0929 11:30:20.483077 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f840aee8-6059-4658-89f6-09d799f64614" containerName="watcher-decision-engine" Sep 29 
11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483084 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f840aee8-6059-4658-89f6-09d799f64614" containerName="watcher-decision-engine" Sep 29 11:30:20 crc kubenswrapper[4752]: E0929 11:30:20.483099 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f840aee8-6059-4658-89f6-09d799f64614" containerName="watcher-decision-engine" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483106 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f840aee8-6059-4658-89f6-09d799f64614" containerName="watcher-decision-engine" Sep 29 11:30:20 crc kubenswrapper[4752]: E0929 11:30:20.483118 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f840aee8-6059-4658-89f6-09d799f64614" containerName="watcher-decision-engine" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483125 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f840aee8-6059-4658-89f6-09d799f64614" containerName="watcher-decision-engine" Sep 29 11:30:20 crc kubenswrapper[4752]: E0929 11:30:20.483138 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f840aee8-6059-4658-89f6-09d799f64614" containerName="watcher-decision-engine" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483145 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f840aee8-6059-4658-89f6-09d799f64614" containerName="watcher-decision-engine" Sep 29 11:30:20 crc kubenswrapper[4752]: E0929 11:30:20.483156 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f840aee8-6059-4658-89f6-09d799f64614" containerName="watcher-decision-engine" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483163 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f840aee8-6059-4658-89f6-09d799f64614" containerName="watcher-decision-engine" Sep 29 11:30:20 crc kubenswrapper[4752]: E0929 11:30:20.483179 4752 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f840aee8-6059-4658-89f6-09d799f64614" containerName="watcher-decision-engine" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483187 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f840aee8-6059-4658-89f6-09d799f64614" containerName="watcher-decision-engine" Sep 29 11:30:20 crc kubenswrapper[4752]: E0929 11:30:20.483217 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c139f9-abc5-4375-a70f-768959d3945e" containerName="proxy-httpd" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483224 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c139f9-abc5-4375-a70f-768959d3945e" containerName="proxy-httpd" Sep 29 11:30:20 crc kubenswrapper[4752]: E0929 11:30:20.483235 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d7502d-c8b2-4023-a56f-f69bf1dc4b0f" containerName="watcher-api" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483242 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d7502d-c8b2-4023-a56f-f69bf1dc4b0f" containerName="watcher-api" Sep 29 11:30:20 crc kubenswrapper[4752]: E0929 11:30:20.483253 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd" containerName="mariadb-database-create" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483261 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd" containerName="mariadb-database-create" Sep 29 11:30:20 crc kubenswrapper[4752]: E0929 11:30:20.483273 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d7502d-c8b2-4023-a56f-f69bf1dc4b0f" containerName="watcher-kuttl-api-log" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483281 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d7502d-c8b2-4023-a56f-f69bf1dc4b0f" containerName="watcher-kuttl-api-log" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483488 4752 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="514664cb-8dd6-4485-bd72-85346d81346c" containerName="watcher-kuttl-api-log" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483505 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd" containerName="mariadb-database-create" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483515 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f840aee8-6059-4658-89f6-09d799f64614" containerName="watcher-decision-engine" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483527 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c139f9-abc5-4375-a70f-768959d3945e" containerName="sg-core" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483540 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="514664cb-8dd6-4485-bd72-85346d81346c" containerName="watcher-api" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483554 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c139f9-abc5-4375-a70f-768959d3945e" containerName="ceilometer-central-agent" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483568 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f840aee8-6059-4658-89f6-09d799f64614" containerName="watcher-decision-engine" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483580 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f840aee8-6059-4658-89f6-09d799f64614" containerName="watcher-decision-engine" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483593 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f840aee8-6059-4658-89f6-09d799f64614" containerName="watcher-decision-engine" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483604 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="95d7502d-c8b2-4023-a56f-f69bf1dc4b0f" containerName="watcher-api" Sep 29 11:30:20 crc 
kubenswrapper[4752]: I0929 11:30:20.483612 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f840aee8-6059-4658-89f6-09d799f64614" containerName="watcher-decision-engine" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483625 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="95d7502d-c8b2-4023-a56f-f69bf1dc4b0f" containerName="watcher-kuttl-api-log" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483638 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c139f9-abc5-4375-a70f-768959d3945e" containerName="proxy-httpd" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.483648 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c139f9-abc5-4375-a70f-768959d3945e" containerName="ceilometer-notification-agent" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.484354 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f840aee8-6059-4658-89f6-09d799f64614" containerName="watcher-decision-engine" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.485728 4752 scope.go:117] "RemoveContainer" containerID="92ba532bef5e98659babc1ec73055f9b66e157b9db2ceb6db7057163e037d15a" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.485848 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.489462 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.489684 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.493384 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.502321 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.505872 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgnnl\" (UniqueName: \"kubernetes.io/projected/43876fe6-a17d-40d0-b0f3-5389fcf7c179-kube-api-access-wgnnl\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.505977 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-config-data\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.506006 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-scripts\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.506171 4752 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.506235 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43876fe6-a17d-40d0-b0f3-5389fcf7c179-log-httpd\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.506272 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43876fe6-a17d-40d0-b0f3-5389fcf7c179-run-httpd\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.506320 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.506361 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.524998 4752 scope.go:117] "RemoveContainer" 
containerID="f29594268803a043a1620f6e0479c1da3affafabd1524e87c5cd3d694969ded8" Sep 29 11:30:20 crc kubenswrapper[4752]: E0929 11:30:20.525453 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f29594268803a043a1620f6e0479c1da3affafabd1524e87c5cd3d694969ded8\": container with ID starting with f29594268803a043a1620f6e0479c1da3affafabd1524e87c5cd3d694969ded8 not found: ID does not exist" containerID="f29594268803a043a1620f6e0479c1da3affafabd1524e87c5cd3d694969ded8" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.525483 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f29594268803a043a1620f6e0479c1da3affafabd1524e87c5cd3d694969ded8"} err="failed to get container status \"f29594268803a043a1620f6e0479c1da3affafabd1524e87c5cd3d694969ded8\": rpc error: code = NotFound desc = could not find container \"f29594268803a043a1620f6e0479c1da3affafabd1524e87c5cd3d694969ded8\": container with ID starting with f29594268803a043a1620f6e0479c1da3affafabd1524e87c5cd3d694969ded8 not found: ID does not exist" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.525504 4752 scope.go:117] "RemoveContainer" containerID="2de9987615f58684a718a1384a0714af8a8938e628d0a1a3962a6135810349b1" Sep 29 11:30:20 crc kubenswrapper[4752]: E0929 11:30:20.525765 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2de9987615f58684a718a1384a0714af8a8938e628d0a1a3962a6135810349b1\": container with ID starting with 2de9987615f58684a718a1384a0714af8a8938e628d0a1a3962a6135810349b1 not found: ID does not exist" containerID="2de9987615f58684a718a1384a0714af8a8938e628d0a1a3962a6135810349b1" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.525794 4752 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2de9987615f58684a718a1384a0714af8a8938e628d0a1a3962a6135810349b1"} err="failed to get container status \"2de9987615f58684a718a1384a0714af8a8938e628d0a1a3962a6135810349b1\": rpc error: code = NotFound desc = could not find container \"2de9987615f58684a718a1384a0714af8a8938e628d0a1a3962a6135810349b1\": container with ID starting with 2de9987615f58684a718a1384a0714af8a8938e628d0a1a3962a6135810349b1 not found: ID does not exist" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.525828 4752 scope.go:117] "RemoveContainer" containerID="7e94e0121f7521fb5e12d0c79f85472c2a3a837e529ded4449b1778d4863e3d5" Sep 29 11:30:20 crc kubenswrapper[4752]: E0929 11:30:20.526224 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e94e0121f7521fb5e12d0c79f85472c2a3a837e529ded4449b1778d4863e3d5\": container with ID starting with 7e94e0121f7521fb5e12d0c79f85472c2a3a837e529ded4449b1778d4863e3d5 not found: ID does not exist" containerID="7e94e0121f7521fb5e12d0c79f85472c2a3a837e529ded4449b1778d4863e3d5" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.526277 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e94e0121f7521fb5e12d0c79f85472c2a3a837e529ded4449b1778d4863e3d5"} err="failed to get container status \"7e94e0121f7521fb5e12d0c79f85472c2a3a837e529ded4449b1778d4863e3d5\": rpc error: code = NotFound desc = could not find container \"7e94e0121f7521fb5e12d0c79f85472c2a3a837e529ded4449b1778d4863e3d5\": container with ID starting with 7e94e0121f7521fb5e12d0c79f85472c2a3a837e529ded4449b1778d4863e3d5 not found: ID does not exist" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.526314 4752 scope.go:117] "RemoveContainer" containerID="92ba532bef5e98659babc1ec73055f9b66e157b9db2ceb6db7057163e037d15a" Sep 29 11:30:20 crc kubenswrapper[4752]: E0929 11:30:20.526626 4752 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"92ba532bef5e98659babc1ec73055f9b66e157b9db2ceb6db7057163e037d15a\": container with ID starting with 92ba532bef5e98659babc1ec73055f9b66e157b9db2ceb6db7057163e037d15a not found: ID does not exist" containerID="92ba532bef5e98659babc1ec73055f9b66e157b9db2ceb6db7057163e037d15a" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.526662 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92ba532bef5e98659babc1ec73055f9b66e157b9db2ceb6db7057163e037d15a"} err="failed to get container status \"92ba532bef5e98659babc1ec73055f9b66e157b9db2ceb6db7057163e037d15a\": rpc error: code = NotFound desc = could not find container \"92ba532bef5e98659babc1ec73055f9b66e157b9db2ceb6db7057163e037d15a\": container with ID starting with 92ba532bef5e98659babc1ec73055f9b66e157b9db2ceb6db7057163e037d15a not found: ID does not exist" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.607727 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-config-data\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.607785 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-scripts\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.607875 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " 
pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.607916 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43876fe6-a17d-40d0-b0f3-5389fcf7c179-log-httpd\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.607940 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43876fe6-a17d-40d0-b0f3-5389fcf7c179-run-httpd\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.607971 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.608002 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.608040 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgnnl\" (UniqueName: \"kubernetes.io/projected/43876fe6-a17d-40d0-b0f3-5389fcf7c179-kube-api-access-wgnnl\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.608659 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43876fe6-a17d-40d0-b0f3-5389fcf7c179-log-httpd\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.608686 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43876fe6-a17d-40d0-b0f3-5389fcf7c179-run-httpd\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.612319 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.613033 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-scripts\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.613674 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.614051 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-config-data\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " 
pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.617424 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.627246 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgnnl\" (UniqueName: \"kubernetes.io/projected/43876fe6-a17d-40d0-b0f3-5389fcf7c179-kube-api-access-wgnnl\") pod \"ceilometer-0\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:20 crc kubenswrapper[4752]: I0929 11:30:20.803764 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:21 crc kubenswrapper[4752]: I0929 11:30:21.125004 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:30:21 crc kubenswrapper[4752]: I0929 11:30:21.441237 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"43876fe6-a17d-40d0-b0f3-5389fcf7c179","Type":"ContainerStarted","Data":"ef3ffa114153547c9fbaa36df20578bf866dca015457ffc8f1813ad52b601a3f"} Sep 29 11:30:22 crc kubenswrapper[4752]: I0929 11:30:22.043443 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70c139f9-abc5-4375-a70f-768959d3945e" path="/var/lib/kubelet/pods/70c139f9-abc5-4375-a70f-768959d3945e/volumes" Sep 29 11:30:22 crc kubenswrapper[4752]: I0929 11:30:22.458168 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"43876fe6-a17d-40d0-b0f3-5389fcf7c179","Type":"ContainerStarted","Data":"7a211094737485805926e0f428448efffff5f17e32d47c5b891bb32355b78353"} Sep 
29 11:30:22 crc kubenswrapper[4752]: I0929 11:30:22.458777 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"43876fe6-a17d-40d0-b0f3-5389fcf7c179","Type":"ContainerStarted","Data":"96d9f0d061ed429cb0d14164aaa01daee3d32bd9e68ac061b00d9881568691b5"} Sep 29 11:30:22 crc kubenswrapper[4752]: I0929 11:30:22.655937 4752 scope.go:117] "RemoveContainer" containerID="0f2d0a93bf241ea6876c7045753bca8c3b6ffb6faf20b37b4418f0b7c8b82cc4" Sep 29 11:30:23 crc kubenswrapper[4752]: I0929 11:30:23.468542 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"43876fe6-a17d-40d0-b0f3-5389fcf7c179","Type":"ContainerStarted","Data":"9e2a41b23f24f68d12ad7d7f0df2e1b1a0aba00ffb05714477bca67fb8e879af"} Sep 29 11:30:24 crc kubenswrapper[4752]: E0929 11:30:24.455096 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:30:24 crc kubenswrapper[4752]: E0929 11:30:24.456742 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:30:24 crc kubenswrapper[4752]: E0929 11:30:24.458531 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:30:24 crc 
kubenswrapper[4752]: E0929 11:30:24.458608 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="a72ff6e9-9f96-4da4-979c-7ef8afdc59c7" containerName="watcher-applier" Sep 29 11:30:25 crc kubenswrapper[4752]: I0929 11:30:25.487449 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"43876fe6-a17d-40d0-b0f3-5389fcf7c179","Type":"ContainerStarted","Data":"00bcd9f4c5cfada498cfd0d38aa30b19bc40ff1a783f39496e9595b131972799"} Sep 29 11:30:25 crc kubenswrapper[4752]: I0929 11:30:25.488040 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:25 crc kubenswrapper[4752]: I0929 11:30:25.517848 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.628821635 podStartE2EDuration="5.517825892s" podCreationTimestamp="2025-09-29 11:30:20 +0000 UTC" firstStartedPulling="2025-09-29 11:30:21.136477993 +0000 UTC m=+2761.925619660" lastFinishedPulling="2025-09-29 11:30:25.02548225 +0000 UTC m=+2765.814623917" observedRunningTime="2025-09-29 11:30:25.512133112 +0000 UTC m=+2766.301274809" watchObservedRunningTime="2025-09-29 11:30:25.517825892 +0000 UTC m=+2766.306967559" Sep 29 11:30:28 crc kubenswrapper[4752]: I0929 11:30:28.930147 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-20c0-account-create-jr4ls"] Sep 29 11:30:28 crc kubenswrapper[4752]: I0929 11:30:28.932414 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-20c0-account-create-jr4ls" Sep 29 11:30:28 crc kubenswrapper[4752]: I0929 11:30:28.936748 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-20c0-account-create-jr4ls"] Sep 29 11:30:28 crc kubenswrapper[4752]: I0929 11:30:28.937030 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Sep 29 11:30:28 crc kubenswrapper[4752]: I0929 11:30:28.965239 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2rn6\" (UniqueName: \"kubernetes.io/projected/307f40b1-f2e8-408e-9d6b-068f0259c850-kube-api-access-n2rn6\") pod \"watcher-20c0-account-create-jr4ls\" (UID: \"307f40b1-f2e8-408e-9d6b-068f0259c850\") " pod="watcher-kuttl-default/watcher-20c0-account-create-jr4ls" Sep 29 11:30:29 crc kubenswrapper[4752]: I0929 11:30:29.067234 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2rn6\" (UniqueName: \"kubernetes.io/projected/307f40b1-f2e8-408e-9d6b-068f0259c850-kube-api-access-n2rn6\") pod \"watcher-20c0-account-create-jr4ls\" (UID: \"307f40b1-f2e8-408e-9d6b-068f0259c850\") " pod="watcher-kuttl-default/watcher-20c0-account-create-jr4ls" Sep 29 11:30:29 crc kubenswrapper[4752]: I0929 11:30:29.090085 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2rn6\" (UniqueName: \"kubernetes.io/projected/307f40b1-f2e8-408e-9d6b-068f0259c850-kube-api-access-n2rn6\") pod \"watcher-20c0-account-create-jr4ls\" (UID: \"307f40b1-f2e8-408e-9d6b-068f0259c850\") " pod="watcher-kuttl-default/watcher-20c0-account-create-jr4ls" Sep 29 11:30:29 crc kubenswrapper[4752]: I0929 11:30:29.252686 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-20c0-account-create-jr4ls" Sep 29 11:30:29 crc kubenswrapper[4752]: E0929 11:30:29.463442 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:30:29 crc kubenswrapper[4752]: E0929 11:30:29.465185 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:30:29 crc kubenswrapper[4752]: E0929 11:30:29.466415 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:30:29 crc kubenswrapper[4752]: E0929 11:30:29.466492 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="a72ff6e9-9f96-4da4-979c-7ef8afdc59c7" containerName="watcher-applier" Sep 29 11:30:29 crc kubenswrapper[4752]: I0929 11:30:29.684400 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-20c0-account-create-jr4ls"] Sep 29 11:30:29 crc kubenswrapper[4752]: W0929 11:30:29.695610 4752 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod307f40b1_f2e8_408e_9d6b_068f0259c850.slice/crio-87001da7cc5f4f9f06cd5ab675a5cf2f198f1b56d0bf9e78532211bcafbb3504 WatchSource:0}: Error finding container 87001da7cc5f4f9f06cd5ab675a5cf2f198f1b56d0bf9e78532211bcafbb3504: Status 404 returned error can't find the container with id 87001da7cc5f4f9f06cd5ab675a5cf2f198f1b56d0bf9e78532211bcafbb3504 Sep 29 11:30:30 crc kubenswrapper[4752]: I0929 11:30:30.546296 4752 generic.go:334] "Generic (PLEG): container finished" podID="307f40b1-f2e8-408e-9d6b-068f0259c850" containerID="3d7e7f3e450c6ea4f1c1ef55f7b063fcbd91f92640ab62a4ba48a2ea1225a5cb" exitCode=0 Sep 29 11:30:30 crc kubenswrapper[4752]: I0929 11:30:30.546349 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-20c0-account-create-jr4ls" event={"ID":"307f40b1-f2e8-408e-9d6b-068f0259c850","Type":"ContainerDied","Data":"3d7e7f3e450c6ea4f1c1ef55f7b063fcbd91f92640ab62a4ba48a2ea1225a5cb"} Sep 29 11:30:30 crc kubenswrapper[4752]: I0929 11:30:30.546388 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-20c0-account-create-jr4ls" event={"ID":"307f40b1-f2e8-408e-9d6b-068f0259c850","Type":"ContainerStarted","Data":"87001da7cc5f4f9f06cd5ab675a5cf2f198f1b56d0bf9e78532211bcafbb3504"} Sep 29 11:30:31 crc kubenswrapper[4752]: I0929 11:30:31.879541 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-20c0-account-create-jr4ls" Sep 29 11:30:31 crc kubenswrapper[4752]: I0929 11:30:31.912382 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2rn6\" (UniqueName: \"kubernetes.io/projected/307f40b1-f2e8-408e-9d6b-068f0259c850-kube-api-access-n2rn6\") pod \"307f40b1-f2e8-408e-9d6b-068f0259c850\" (UID: \"307f40b1-f2e8-408e-9d6b-068f0259c850\") " Sep 29 11:30:31 crc kubenswrapper[4752]: I0929 11:30:31.919024 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/307f40b1-f2e8-408e-9d6b-068f0259c850-kube-api-access-n2rn6" (OuterVolumeSpecName: "kube-api-access-n2rn6") pod "307f40b1-f2e8-408e-9d6b-068f0259c850" (UID: "307f40b1-f2e8-408e-9d6b-068f0259c850"). InnerVolumeSpecName "kube-api-access-n2rn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:30:32 crc kubenswrapper[4752]: I0929 11:30:32.014369 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2rn6\" (UniqueName: \"kubernetes.io/projected/307f40b1-f2e8-408e-9d6b-068f0259c850-kube-api-access-n2rn6\") on node \"crc\" DevicePath \"\"" Sep 29 11:30:32 crc kubenswrapper[4752]: I0929 11:30:32.563099 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-20c0-account-create-jr4ls" event={"ID":"307f40b1-f2e8-408e-9d6b-068f0259c850","Type":"ContainerDied","Data":"87001da7cc5f4f9f06cd5ab675a5cf2f198f1b56d0bf9e78532211bcafbb3504"} Sep 29 11:30:32 crc kubenswrapper[4752]: I0929 11:30:32.563137 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87001da7cc5f4f9f06cd5ab675a5cf2f198f1b56d0bf9e78532211bcafbb3504" Sep 29 11:30:32 crc kubenswrapper[4752]: I0929 11:30:32.563178 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-20c0-account-create-jr4ls" Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.152994 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6"] Sep 29 11:30:34 crc kubenswrapper[4752]: E0929 11:30:34.153726 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="307f40b1-f2e8-408e-9d6b-068f0259c850" containerName="mariadb-account-create" Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.153741 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="307f40b1-f2e8-408e-9d6b-068f0259c850" containerName="mariadb-account-create" Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.153974 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="307f40b1-f2e8-408e-9d6b-068f0259c850" containerName="mariadb-account-create" Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.154691 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6" Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.157956 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-sdd7c" Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.157979 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.164169 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6"] Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.252956 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c788fc02-bde8-4ee8-a380-0c50d4b56c65-config-data\") pod \"watcher-kuttl-db-sync-fnlm6\" (UID: \"c788fc02-bde8-4ee8-a380-0c50d4b56c65\") " 
pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6" Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.253105 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c788fc02-bde8-4ee8-a380-0c50d4b56c65-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-fnlm6\" (UID: \"c788fc02-bde8-4ee8-a380-0c50d4b56c65\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6" Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.253155 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nvv9\" (UniqueName: \"kubernetes.io/projected/c788fc02-bde8-4ee8-a380-0c50d4b56c65-kube-api-access-7nvv9\") pod \"watcher-kuttl-db-sync-fnlm6\" (UID: \"c788fc02-bde8-4ee8-a380-0c50d4b56c65\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6" Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.253174 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c788fc02-bde8-4ee8-a380-0c50d4b56c65-db-sync-config-data\") pod \"watcher-kuttl-db-sync-fnlm6\" (UID: \"c788fc02-bde8-4ee8-a380-0c50d4b56c65\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6" Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.355176 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c788fc02-bde8-4ee8-a380-0c50d4b56c65-config-data\") pod \"watcher-kuttl-db-sync-fnlm6\" (UID: \"c788fc02-bde8-4ee8-a380-0c50d4b56c65\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6" Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.355244 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c788fc02-bde8-4ee8-a380-0c50d4b56c65-combined-ca-bundle\") pod 
\"watcher-kuttl-db-sync-fnlm6\" (UID: \"c788fc02-bde8-4ee8-a380-0c50d4b56c65\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6" Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.355279 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nvv9\" (UniqueName: \"kubernetes.io/projected/c788fc02-bde8-4ee8-a380-0c50d4b56c65-kube-api-access-7nvv9\") pod \"watcher-kuttl-db-sync-fnlm6\" (UID: \"c788fc02-bde8-4ee8-a380-0c50d4b56c65\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6" Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.355296 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c788fc02-bde8-4ee8-a380-0c50d4b56c65-db-sync-config-data\") pod \"watcher-kuttl-db-sync-fnlm6\" (UID: \"c788fc02-bde8-4ee8-a380-0c50d4b56c65\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6" Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.359903 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c788fc02-bde8-4ee8-a380-0c50d4b56c65-config-data\") pod \"watcher-kuttl-db-sync-fnlm6\" (UID: \"c788fc02-bde8-4ee8-a380-0c50d4b56c65\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6" Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.360533 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c788fc02-bde8-4ee8-a380-0c50d4b56c65-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-fnlm6\" (UID: \"c788fc02-bde8-4ee8-a380-0c50d4b56c65\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6" Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.367235 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c788fc02-bde8-4ee8-a380-0c50d4b56c65-db-sync-config-data\") pod 
\"watcher-kuttl-db-sync-fnlm6\" (UID: \"c788fc02-bde8-4ee8-a380-0c50d4b56c65\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6" Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.378859 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nvv9\" (UniqueName: \"kubernetes.io/projected/c788fc02-bde8-4ee8-a380-0c50d4b56c65-kube-api-access-7nvv9\") pod \"watcher-kuttl-db-sync-fnlm6\" (UID: \"c788fc02-bde8-4ee8-a380-0c50d4b56c65\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6" Sep 29 11:30:34 crc kubenswrapper[4752]: E0929 11:30:34.454689 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:30:34 crc kubenswrapper[4752]: E0929 11:30:34.456309 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:30:34 crc kubenswrapper[4752]: E0929 11:30:34.457372 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 29 11:30:34 crc kubenswrapper[4752]: E0929 11:30:34.457449 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="a72ff6e9-9f96-4da4-979c-7ef8afdc59c7" containerName="watcher-applier" Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.474651 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6" Sep 29 11:30:34 crc kubenswrapper[4752]: I0929 11:30:34.908167 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6"] Sep 29 11:30:34 crc kubenswrapper[4752]: W0929 11:30:34.914628 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc788fc02_bde8_4ee8_a380_0c50d4b56c65.slice/crio-8cc55116a7028e9047928a4d23112c84400ba47e4d4022e39601aa4ac7be2f38 WatchSource:0}: Error finding container 8cc55116a7028e9047928a4d23112c84400ba47e4d4022e39601aa4ac7be2f38: Status 404 returned error can't find the container with id 8cc55116a7028e9047928a4d23112c84400ba47e4d4022e39601aa4ac7be2f38 Sep 29 11:30:35 crc kubenswrapper[4752]: I0929 11:30:35.589104 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6" event={"ID":"c788fc02-bde8-4ee8-a380-0c50d4b56c65","Type":"ContainerStarted","Data":"6f2de9bbeb6be56bd04b3f9016ee5f773fc52594a8f8e7e200ac0433ccec7f63"} Sep 29 11:30:35 crc kubenswrapper[4752]: I0929 11:30:35.589152 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6" event={"ID":"c788fc02-bde8-4ee8-a380-0c50d4b56c65","Type":"ContainerStarted","Data":"8cc55116a7028e9047928a4d23112c84400ba47e4d4022e39601aa4ac7be2f38"} Sep 29 11:30:35 crc kubenswrapper[4752]: I0929 11:30:35.611348 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6" podStartSLOduration=1.611328604 podStartE2EDuration="1.611328604s" podCreationTimestamp="2025-09-29 11:30:34 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:30:35.606003725 +0000 UTC m=+2776.395145392" watchObservedRunningTime="2025-09-29 11:30:35.611328604 +0000 UTC m=+2776.400470271"
Sep 29 11:30:37 crc kubenswrapper[4752]: I0929 11:30:37.604160 4752 generic.go:334] "Generic (PLEG): container finished" podID="c788fc02-bde8-4ee8-a380-0c50d4b56c65" containerID="6f2de9bbeb6be56bd04b3f9016ee5f773fc52594a8f8e7e200ac0433ccec7f63" exitCode=0
Sep 29 11:30:37 crc kubenswrapper[4752]: I0929 11:30:37.604292 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6" event={"ID":"c788fc02-bde8-4ee8-a380-0c50d4b56c65","Type":"ContainerDied","Data":"6f2de9bbeb6be56bd04b3f9016ee5f773fc52594a8f8e7e200ac0433ccec7f63"}
Sep 29 11:30:38 crc kubenswrapper[4752]: I0929 11:30:38.949329 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6"
Sep 29 11:30:39 crc kubenswrapper[4752]: I0929 11:30:39.023741 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c788fc02-bde8-4ee8-a380-0c50d4b56c65-db-sync-config-data\") pod \"c788fc02-bde8-4ee8-a380-0c50d4b56c65\" (UID: \"c788fc02-bde8-4ee8-a380-0c50d4b56c65\") "
Sep 29 11:30:39 crc kubenswrapper[4752]: I0929 11:30:39.023864 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c788fc02-bde8-4ee8-a380-0c50d4b56c65-config-data\") pod \"c788fc02-bde8-4ee8-a380-0c50d4b56c65\" (UID: \"c788fc02-bde8-4ee8-a380-0c50d4b56c65\") "
Sep 29 11:30:39 crc kubenswrapper[4752]: I0929 11:30:39.023930 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nvv9\" (UniqueName: \"kubernetes.io/projected/c788fc02-bde8-4ee8-a380-0c50d4b56c65-kube-api-access-7nvv9\") pod \"c788fc02-bde8-4ee8-a380-0c50d4b56c65\" (UID: \"c788fc02-bde8-4ee8-a380-0c50d4b56c65\") "
Sep 29 11:30:39 crc kubenswrapper[4752]: I0929 11:30:39.024026 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c788fc02-bde8-4ee8-a380-0c50d4b56c65-combined-ca-bundle\") pod \"c788fc02-bde8-4ee8-a380-0c50d4b56c65\" (UID: \"c788fc02-bde8-4ee8-a380-0c50d4b56c65\") "
Sep 29 11:30:39 crc kubenswrapper[4752]: I0929 11:30:39.030441 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c788fc02-bde8-4ee8-a380-0c50d4b56c65-kube-api-access-7nvv9" (OuterVolumeSpecName: "kube-api-access-7nvv9") pod "c788fc02-bde8-4ee8-a380-0c50d4b56c65" (UID: "c788fc02-bde8-4ee8-a380-0c50d4b56c65"). InnerVolumeSpecName "kube-api-access-7nvv9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 11:30:39 crc kubenswrapper[4752]: I0929 11:30:39.045347 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c788fc02-bde8-4ee8-a380-0c50d4b56c65-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c788fc02-bde8-4ee8-a380-0c50d4b56c65" (UID: "c788fc02-bde8-4ee8-a380-0c50d4b56c65"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:30:39 crc kubenswrapper[4752]: I0929 11:30:39.051063 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c788fc02-bde8-4ee8-a380-0c50d4b56c65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c788fc02-bde8-4ee8-a380-0c50d4b56c65" (UID: "c788fc02-bde8-4ee8-a380-0c50d4b56c65"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:30:39 crc kubenswrapper[4752]: I0929 11:30:39.080824 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c788fc02-bde8-4ee8-a380-0c50d4b56c65-config-data" (OuterVolumeSpecName: "config-data") pod "c788fc02-bde8-4ee8-a380-0c50d4b56c65" (UID: "c788fc02-bde8-4ee8-a380-0c50d4b56c65"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:30:39 crc kubenswrapper[4752]: I0929 11:30:39.126940 4752 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c788fc02-bde8-4ee8-a380-0c50d4b56c65-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:39 crc kubenswrapper[4752]: I0929 11:30:39.126977 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c788fc02-bde8-4ee8-a380-0c50d4b56c65-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:39 crc kubenswrapper[4752]: I0929 11:30:39.126987 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nvv9\" (UniqueName: \"kubernetes.io/projected/c788fc02-bde8-4ee8-a380-0c50d4b56c65-kube-api-access-7nvv9\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:39 crc kubenswrapper[4752]: I0929 11:30:39.126998 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c788fc02-bde8-4ee8-a380-0c50d4b56c65-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:39 crc kubenswrapper[4752]: E0929 11:30:39.454450 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Sep 29 11:30:39 crc kubenswrapper[4752]: E0929 11:30:39.455775 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Sep 29 11:30:39 crc kubenswrapper[4752]: E0929 11:30:39.457353 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Sep 29 11:30:39 crc kubenswrapper[4752]: E0929 11:30:39.457391 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="a72ff6e9-9f96-4da4-979c-7ef8afdc59c7" containerName="watcher-applier"
Sep 29 11:30:39 crc kubenswrapper[4752]: I0929 11:30:39.621482 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6" event={"ID":"c788fc02-bde8-4ee8-a380-0c50d4b56c65","Type":"ContainerDied","Data":"8cc55116a7028e9047928a4d23112c84400ba47e4d4022e39601aa4ac7be2f38"}
Sep 29 11:30:39 crc kubenswrapper[4752]: I0929 11:30:39.621527 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cc55116a7028e9047928a4d23112c84400ba47e4d4022e39601aa4ac7be2f38"
Sep 29 11:30:39 crc kubenswrapper[4752]: I0929 11:30:39.621543 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.237913 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Sep 29 11:30:40 crc kubenswrapper[4752]: E0929 11:30:40.238266 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c788fc02-bde8-4ee8-a380-0c50d4b56c65" containerName="watcher-kuttl-db-sync"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.238280 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c788fc02-bde8-4ee8-a380-0c50d4b56c65" containerName="watcher-kuttl-db-sync"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.238430 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="c788fc02-bde8-4ee8-a380-0c50d4b56c65" containerName="watcher-kuttl-db-sync"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.239346 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.241371 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-sdd7c"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.243101 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.260962 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.269902 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.271591 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.300976 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.312392 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.343903 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvvfz\" (UniqueName: \"kubernetes.io/projected/2de6e6b7-7c8e-4e4d-9e56-c341913f282c-kube-api-access-vvvfz\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2de6e6b7-7c8e-4e4d-9e56-c341913f282c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.343987 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04d72897-fee3-42f8-8b75-4ade4c6b4f9a-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"04d72897-fee3-42f8-8b75-4ade4c6b4f9a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.344013 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/04d72897-fee3-42f8-8b75-4ade4c6b4f9a-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"04d72897-fee3-42f8-8b75-4ade4c6b4f9a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.344036 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de6e6b7-7c8e-4e4d-9e56-c341913f282c-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2de6e6b7-7c8e-4e4d-9e56-c341913f282c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.344071 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2de6e6b7-7c8e-4e4d-9e56-c341913f282c-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2de6e6b7-7c8e-4e4d-9e56-c341913f282c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.344099 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04d72897-fee3-42f8-8b75-4ade4c6b4f9a-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"04d72897-fee3-42f8-8b75-4ade4c6b4f9a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.344147 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de6e6b7-7c8e-4e4d-9e56-c341913f282c-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2de6e6b7-7c8e-4e4d-9e56-c341913f282c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.344227 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04d72897-fee3-42f8-8b75-4ade4c6b4f9a-logs\") pod \"watcher-kuttl-api-0\" (UID: \"04d72897-fee3-42f8-8b75-4ade4c6b4f9a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.344255 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qss5\" (UniqueName: \"kubernetes.io/projected/04d72897-fee3-42f8-8b75-4ade4c6b4f9a-kube-api-access-2qss5\") pod \"watcher-kuttl-api-0\" (UID: \"04d72897-fee3-42f8-8b75-4ade4c6b4f9a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.344297 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2de6e6b7-7c8e-4e4d-9e56-c341913f282c-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2de6e6b7-7c8e-4e4d-9e56-c341913f282c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.445553 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de6e6b7-7c8e-4e4d-9e56-c341913f282c-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2de6e6b7-7c8e-4e4d-9e56-c341913f282c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.445635 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04d72897-fee3-42f8-8b75-4ade4c6b4f9a-logs\") pod \"watcher-kuttl-api-0\" (UID: \"04d72897-fee3-42f8-8b75-4ade4c6b4f9a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.445652 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qss5\" (UniqueName: \"kubernetes.io/projected/04d72897-fee3-42f8-8b75-4ade4c6b4f9a-kube-api-access-2qss5\") pod \"watcher-kuttl-api-0\" (UID: \"04d72897-fee3-42f8-8b75-4ade4c6b4f9a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.445674 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2de6e6b7-7c8e-4e4d-9e56-c341913f282c-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2de6e6b7-7c8e-4e4d-9e56-c341913f282c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.445709 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvvfz\" (UniqueName: \"kubernetes.io/projected/2de6e6b7-7c8e-4e4d-9e56-c341913f282c-kube-api-access-vvvfz\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2de6e6b7-7c8e-4e4d-9e56-c341913f282c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.445750 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04d72897-fee3-42f8-8b75-4ade4c6b4f9a-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"04d72897-fee3-42f8-8b75-4ade4c6b4f9a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.445766 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/04d72897-fee3-42f8-8b75-4ade4c6b4f9a-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"04d72897-fee3-42f8-8b75-4ade4c6b4f9a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.445785 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de6e6b7-7c8e-4e4d-9e56-c341913f282c-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2de6e6b7-7c8e-4e4d-9e56-c341913f282c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.445816 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2de6e6b7-7c8e-4e4d-9e56-c341913f282c-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2de6e6b7-7c8e-4e4d-9e56-c341913f282c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.445839 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04d72897-fee3-42f8-8b75-4ade4c6b4f9a-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"04d72897-fee3-42f8-8b75-4ade4c6b4f9a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.446237 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2de6e6b7-7c8e-4e4d-9e56-c341913f282c-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2de6e6b7-7c8e-4e4d-9e56-c341913f282c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.446261 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04d72897-fee3-42f8-8b75-4ade4c6b4f9a-logs\") pod \"watcher-kuttl-api-0\" (UID: \"04d72897-fee3-42f8-8b75-4ade4c6b4f9a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.454539 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de6e6b7-7c8e-4e4d-9e56-c341913f282c-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2de6e6b7-7c8e-4e4d-9e56-c341913f282c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.456392 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de6e6b7-7c8e-4e4d-9e56-c341913f282c-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2de6e6b7-7c8e-4e4d-9e56-c341913f282c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.464537 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04d72897-fee3-42f8-8b75-4ade4c6b4f9a-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"04d72897-fee3-42f8-8b75-4ade4c6b4f9a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.465691 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04d72897-fee3-42f8-8b75-4ade4c6b4f9a-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"04d72897-fee3-42f8-8b75-4ade4c6b4f9a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.467286 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2de6e6b7-7c8e-4e4d-9e56-c341913f282c-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2de6e6b7-7c8e-4e4d-9e56-c341913f282c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.468998 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/04d72897-fee3-42f8-8b75-4ade4c6b4f9a-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"04d72897-fee3-42f8-8b75-4ade4c6b4f9a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.470066 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qss5\" (UniqueName: \"kubernetes.io/projected/04d72897-fee3-42f8-8b75-4ade4c6b4f9a-kube-api-access-2qss5\") pod \"watcher-kuttl-api-0\" (UID: \"04d72897-fee3-42f8-8b75-4ade4c6b4f9a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.473352 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvvfz\" (UniqueName: \"kubernetes.io/projected/2de6e6b7-7c8e-4e4d-9e56-c341913f282c-kube-api-access-vvvfz\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2de6e6b7-7c8e-4e4d-9e56-c341913f282c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.558013 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:40 crc kubenswrapper[4752]: I0929 11:30:40.608316 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Sep 29 11:30:41 crc kubenswrapper[4752]: I0929 11:30:41.009263 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Sep 29 11:30:41 crc kubenswrapper[4752]: W0929 11:30:41.111779 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2de6e6b7_7c8e_4e4d_9e56_c341913f282c.slice/crio-e3492ba4302947fe98a825e9a22aa706802fe09f06979de32f67aa54ed5e71c1 WatchSource:0}: Error finding container e3492ba4302947fe98a825e9a22aa706802fe09f06979de32f67aa54ed5e71c1: Status 404 returned error can't find the container with id e3492ba4302947fe98a825e9a22aa706802fe09f06979de32f67aa54ed5e71c1
Sep 29 11:30:41 crc kubenswrapper[4752]: I0929 11:30:41.113310 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Sep 29 11:30:41 crc kubenswrapper[4752]: I0929 11:30:41.637503 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"2de6e6b7-7c8e-4e4d-9e56-c341913f282c","Type":"ContainerStarted","Data":"3a0bdbf0973ffedfae78989f6b62c1dbb5a38b5c74537635c5541fc752d20855"}
Sep 29 11:30:41 crc kubenswrapper[4752]: I0929 11:30:41.637756 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"2de6e6b7-7c8e-4e4d-9e56-c341913f282c","Type":"ContainerStarted","Data":"e3492ba4302947fe98a825e9a22aa706802fe09f06979de32f67aa54ed5e71c1"}
Sep 29 11:30:41 crc kubenswrapper[4752]: I0929 11:30:41.639867 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"04d72897-fee3-42f8-8b75-4ade4c6b4f9a","Type":"ContainerStarted","Data":"3f1c44ef854679414556c83116749e85ec7c69e6498953e3a2e97ce223ff5770"}
Sep 29 11:30:41 crc kubenswrapper[4752]: I0929 11:30:41.639906 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"04d72897-fee3-42f8-8b75-4ade4c6b4f9a","Type":"ContainerStarted","Data":"9fca2e09621402701312dd90c894a181cb12524bfdcbd45ae9fb21e5c0297f41"}
Sep 29 11:30:41 crc kubenswrapper[4752]: I0929 11:30:41.639919 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"04d72897-fee3-42f8-8b75-4ade4c6b4f9a","Type":"ContainerStarted","Data":"cb0f129e2a1219cd017fa5e4a4339d1e48e44d4d833c29a6cf7e760cccb063eb"}
Sep 29 11:30:41 crc kubenswrapper[4752]: I0929 11:30:41.640605 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:41 crc kubenswrapper[4752]: I0929 11:30:41.657414 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=1.657397378 podStartE2EDuration="1.657397378s" podCreationTimestamp="2025-09-29 11:30:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:30:41.653474715 +0000 UTC m=+2782.442616382" watchObservedRunningTime="2025-09-29 11:30:41.657397378 +0000 UTC m=+2782.446539045"
Sep 29 11:30:41 crc kubenswrapper[4752]: I0929 11:30:41.677026 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=1.6770062110000001 podStartE2EDuration="1.677006211s" podCreationTimestamp="2025-09-29 11:30:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:30:41.672041861 +0000 UTC m=+2782.461183528" watchObservedRunningTime="2025-09-29 11:30:41.677006211 +0000 UTC m=+2782.466147878"
Sep 29 11:30:43 crc kubenswrapper[4752]: I0929 11:30:43.653880 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 29 11:30:43 crc kubenswrapper[4752]: I0929 11:30:43.835704 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:44 crc kubenswrapper[4752]: E0929 11:30:44.453266 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086 is running failed: container process not found" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Sep 29 11:30:44 crc kubenswrapper[4752]: E0929 11:30:44.453785 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086 is running failed: container process not found" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Sep 29 11:30:44 crc kubenswrapper[4752]: E0929 11:30:44.454121 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086 is running failed: container process not found" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Sep 29 11:30:44 crc kubenswrapper[4752]: E0929 11:30:44.454170 4752 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086 is running failed: container process not found" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="a72ff6e9-9f96-4da4-979c-7ef8afdc59c7" containerName="watcher-applier"
Sep 29 11:30:44 crc kubenswrapper[4752]: I0929 11:30:44.555074 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:30:44 crc kubenswrapper[4752]: I0929 11:30:44.616326 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpvll\" (UniqueName: \"kubernetes.io/projected/a72ff6e9-9f96-4da4-979c-7ef8afdc59c7-kube-api-access-cpvll\") pod \"a72ff6e9-9f96-4da4-979c-7ef8afdc59c7\" (UID: \"a72ff6e9-9f96-4da4-979c-7ef8afdc59c7\") "
Sep 29 11:30:44 crc kubenswrapper[4752]: I0929 11:30:44.616425 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72ff6e9-9f96-4da4-979c-7ef8afdc59c7-config-data\") pod \"a72ff6e9-9f96-4da4-979c-7ef8afdc59c7\" (UID: \"a72ff6e9-9f96-4da4-979c-7ef8afdc59c7\") "
Sep 29 11:30:44 crc kubenswrapper[4752]: I0929 11:30:44.616580 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a72ff6e9-9f96-4da4-979c-7ef8afdc59c7-logs\") pod \"a72ff6e9-9f96-4da4-979c-7ef8afdc59c7\" (UID: \"a72ff6e9-9f96-4da4-979c-7ef8afdc59c7\") "
Sep 29 11:30:44 crc kubenswrapper[4752]: I0929 11:30:44.617323 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a72ff6e9-9f96-4da4-979c-7ef8afdc59c7-logs" (OuterVolumeSpecName: "logs") pod "a72ff6e9-9f96-4da4-979c-7ef8afdc59c7" (UID: "a72ff6e9-9f96-4da4-979c-7ef8afdc59c7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 11:30:44 crc kubenswrapper[4752]: I0929 11:30:44.634021 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a72ff6e9-9f96-4da4-979c-7ef8afdc59c7-kube-api-access-cpvll" (OuterVolumeSpecName: "kube-api-access-cpvll") pod "a72ff6e9-9f96-4da4-979c-7ef8afdc59c7" (UID: "a72ff6e9-9f96-4da4-979c-7ef8afdc59c7"). InnerVolumeSpecName "kube-api-access-cpvll". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 11:30:44 crc kubenswrapper[4752]: I0929 11:30:44.689162 4752 generic.go:334] "Generic (PLEG): container finished" podID="a72ff6e9-9f96-4da4-979c-7ef8afdc59c7" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086" exitCode=137
Sep 29 11:30:44 crc kubenswrapper[4752]: I0929 11:30:44.690237 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:30:44 crc kubenswrapper[4752]: I0929 11:30:44.690636 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"a72ff6e9-9f96-4da4-979c-7ef8afdc59c7","Type":"ContainerDied","Data":"86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086"}
Sep 29 11:30:44 crc kubenswrapper[4752]: I0929 11:30:44.690672 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"a72ff6e9-9f96-4da4-979c-7ef8afdc59c7","Type":"ContainerDied","Data":"6b808184fa8d5c32f0c6b1b6b6066850985fff4296c723b7af28f8c8380523a1"}
Sep 29 11:30:44 crc kubenswrapper[4752]: I0929 11:30:44.690692 4752 scope.go:117] "RemoveContainer" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086"
Sep 29 11:30:44 crc kubenswrapper[4752]: I0929 11:30:44.708951 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a72ff6e9-9f96-4da4-979c-7ef8afdc59c7-config-data" (OuterVolumeSpecName: "config-data") pod "a72ff6e9-9f96-4da4-979c-7ef8afdc59c7" (UID: "a72ff6e9-9f96-4da4-979c-7ef8afdc59c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 11:30:44 crc kubenswrapper[4752]: I0929 11:30:44.719725 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpvll\" (UniqueName: \"kubernetes.io/projected/a72ff6e9-9f96-4da4-979c-7ef8afdc59c7-kube-api-access-cpvll\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:44 crc kubenswrapper[4752]: I0929 11:30:44.719757 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72ff6e9-9f96-4da4-979c-7ef8afdc59c7-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:44 crc kubenswrapper[4752]: I0929 11:30:44.719766 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a72ff6e9-9f96-4da4-979c-7ef8afdc59c7-logs\") on node \"crc\" DevicePath \"\""
Sep 29 11:30:44 crc kubenswrapper[4752]: I0929 11:30:44.758607 4752 scope.go:117] "RemoveContainer" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086"
Sep 29 11:30:44 crc kubenswrapper[4752]: E0929 11:30:44.759441 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086\": container with ID starting with 86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086 not found: ID does not exist" containerID="86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086"
Sep 29 11:30:44 crc kubenswrapper[4752]: I0929 11:30:44.759470 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086"} err="failed to get container status \"86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086\": rpc error: code = NotFound desc = could not find container \"86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086\": container with ID starting with 86992edd597bf5fa1f0ab937814c8b38338a68164063335ef3b4711dc0323086 not found: ID does not exist"
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.019188 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.024697 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.558598 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.571730 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Sep 29 11:30:45 crc kubenswrapper[4752]: E0929 11:30:45.572126 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72ff6e9-9f96-4da4-979c-7ef8afdc59c7" containerName="watcher-applier"
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.572147 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72ff6e9-9f96-4da4-979c-7ef8afdc59c7" containerName="watcher-applier"
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.572358 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a72ff6e9-9f96-4da4-979c-7ef8afdc59c7" containerName="watcher-applier"
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.573086 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.576120 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data"
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.587710 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.632194 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/555f166b-3ad0-4620-86ec-28c701938327-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"555f166b-3ad0-4620-86ec-28c701938327\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.632313 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555f166b-3ad0-4620-86ec-28c701938327-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"555f166b-3ad0-4620-86ec-28c701938327\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.632355 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/555f166b-3ad0-4620-86ec-28c701938327-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"555f166b-3ad0-4620-86ec-28c701938327\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.632402 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2skt\" (UniqueName: \"kubernetes.io/projected/555f166b-3ad0-4620-86ec-28c701938327-kube-api-access-z2skt\") pod \"watcher-kuttl-applier-0\" (UID: \"555f166b-3ad0-4620-86ec-28c701938327\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.734136 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555f166b-3ad0-4620-86ec-28c701938327-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"555f166b-3ad0-4620-86ec-28c701938327\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.734611 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/555f166b-3ad0-4620-86ec-28c701938327-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"555f166b-3ad0-4620-86ec-28c701938327\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.734786 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2skt\" (UniqueName: \"kubernetes.io/projected/555f166b-3ad0-4620-86ec-28c701938327-kube-api-access-z2skt\") pod \"watcher-kuttl-applier-0\" (UID: \"555f166b-3ad0-4620-86ec-28c701938327\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.734908 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/555f166b-3ad0-4620-86ec-28c701938327-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"555f166b-3ad0-4620-86ec-28c701938327\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.735338 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/555f166b-3ad0-4620-86ec-28c701938327-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"555f166b-3ad0-4620-86ec-28c701938327\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.739903 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555f166b-3ad0-4620-86ec-28c701938327-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"555f166b-3ad0-4620-86ec-28c701938327\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.740230 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/555f166b-3ad0-4620-86ec-28c701938327-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"555f166b-3ad0-4620-86ec-28c701938327\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.761095 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2skt\" (UniqueName: \"kubernetes.io/projected/555f166b-3ad0-4620-86ec-28c701938327-kube-api-access-z2skt\") pod \"watcher-kuttl-applier-0\" (UID: \"555f166b-3ad0-4620-86ec-28c701938327\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Sep 29 11:30:45 crc kubenswrapper[4752]: I0929 11:30:45.903316 4752 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:30:46 crc kubenswrapper[4752]: I0929 11:30:46.044889 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a72ff6e9-9f96-4da4-979c-7ef8afdc59c7" path="/var/lib/kubelet/pods/a72ff6e9-9f96-4da4-979c-7ef8afdc59c7/volumes" Sep 29 11:30:46 crc kubenswrapper[4752]: I0929 11:30:46.328515 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Sep 29 11:30:46 crc kubenswrapper[4752]: W0929 11:30:46.331294 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod555f166b_3ad0_4620_86ec_28c701938327.slice/crio-37b6f4dbc577e2dc12a3a045777d8811ed4b74e3ddd9ebd4895993d1872ade02 WatchSource:0}: Error finding container 37b6f4dbc577e2dc12a3a045777d8811ed4b74e3ddd9ebd4895993d1872ade02: Status 404 returned error can't find the container with id 37b6f4dbc577e2dc12a3a045777d8811ed4b74e3ddd9ebd4895993d1872ade02 Sep 29 11:30:46 crc kubenswrapper[4752]: I0929 11:30:46.710164 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"555f166b-3ad0-4620-86ec-28c701938327","Type":"ContainerStarted","Data":"eb83a566d3a59f889c8e11bbc0ad80f7bc076e7e81bbbbf9cc0ed889de0ccc2c"} Sep 29 11:30:46 crc kubenswrapper[4752]: I0929 11:30:46.710216 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"555f166b-3ad0-4620-86ec-28c701938327","Type":"ContainerStarted","Data":"37b6f4dbc577e2dc12a3a045777d8811ed4b74e3ddd9ebd4895993d1872ade02"} Sep 29 11:30:46 crc kubenswrapper[4752]: I0929 11:30:46.726120 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=1.726104709 podStartE2EDuration="1.726104709s" podCreationTimestamp="2025-09-29 11:30:45 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 11:30:46.724542688 +0000 UTC m=+2787.513684365" watchObservedRunningTime="2025-09-29 11:30:46.726104709 +0000 UTC m=+2787.515246366" Sep 29 11:30:50 crc kubenswrapper[4752]: I0929 11:30:50.559155 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:30:50 crc kubenswrapper[4752]: I0929 11:30:50.563449 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:30:50 crc kubenswrapper[4752]: I0929 11:30:50.608794 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:30:50 crc kubenswrapper[4752]: I0929 11:30:50.633442 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:30:50 crc kubenswrapper[4752]: I0929 11:30:50.739484 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:30:50 crc kubenswrapper[4752]: I0929 11:30:50.745461 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Sep 29 11:30:50 crc kubenswrapper[4752]: I0929 11:30:50.769517 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Sep 29 11:30:50 crc kubenswrapper[4752]: I0929 11:30:50.826941 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:50 crc kubenswrapper[4752]: I0929 11:30:50.904450 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:30:52 crc 
kubenswrapper[4752]: I0929 11:30:52.920369 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:30:52 crc kubenswrapper[4752]: I0929 11:30:52.921238 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" containerName="ceilometer-central-agent" containerID="cri-o://96d9f0d061ed429cb0d14164aaa01daee3d32bd9e68ac061b00d9881568691b5" gracePeriod=30 Sep 29 11:30:52 crc kubenswrapper[4752]: I0929 11:30:52.921294 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" containerName="sg-core" containerID="cri-o://9e2a41b23f24f68d12ad7d7f0df2e1b1a0aba00ffb05714477bca67fb8e879af" gracePeriod=30 Sep 29 11:30:52 crc kubenswrapper[4752]: I0929 11:30:52.921336 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" containerName="proxy-httpd" containerID="cri-o://00bcd9f4c5cfada498cfd0d38aa30b19bc40ff1a783f39496e9595b131972799" gracePeriod=30 Sep 29 11:30:52 crc kubenswrapper[4752]: I0929 11:30:52.921295 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" containerName="ceilometer-notification-agent" containerID="cri-o://7a211094737485805926e0f428448efffff5f17e32d47c5b891bb32355b78353" gracePeriod=30 Sep 29 11:30:53 crc kubenswrapper[4752]: I0929 11:30:53.787192 4752 generic.go:334] "Generic (PLEG): container finished" podID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" containerID="00bcd9f4c5cfada498cfd0d38aa30b19bc40ff1a783f39496e9595b131972799" exitCode=0 Sep 29 11:30:53 crc kubenswrapper[4752]: I0929 11:30:53.787234 4752 generic.go:334] "Generic (PLEG): container finished" 
podID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" containerID="9e2a41b23f24f68d12ad7d7f0df2e1b1a0aba00ffb05714477bca67fb8e879af" exitCode=2 Sep 29 11:30:53 crc kubenswrapper[4752]: I0929 11:30:53.787242 4752 generic.go:334] "Generic (PLEG): container finished" podID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" containerID="96d9f0d061ed429cb0d14164aaa01daee3d32bd9e68ac061b00d9881568691b5" exitCode=0 Sep 29 11:30:53 crc kubenswrapper[4752]: I0929 11:30:53.787299 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"43876fe6-a17d-40d0-b0f3-5389fcf7c179","Type":"ContainerDied","Data":"00bcd9f4c5cfada498cfd0d38aa30b19bc40ff1a783f39496e9595b131972799"} Sep 29 11:30:53 crc kubenswrapper[4752]: I0929 11:30:53.787370 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"43876fe6-a17d-40d0-b0f3-5389fcf7c179","Type":"ContainerDied","Data":"9e2a41b23f24f68d12ad7d7f0df2e1b1a0aba00ffb05714477bca67fb8e879af"} Sep 29 11:30:53 crc kubenswrapper[4752]: I0929 11:30:53.787385 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"43876fe6-a17d-40d0-b0f3-5389fcf7c179","Type":"ContainerDied","Data":"96d9f0d061ed429cb0d14164aaa01daee3d32bd9e68ac061b00d9881568691b5"} Sep 29 11:30:55 crc kubenswrapper[4752]: I0929 11:30:55.904976 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:30:55 crc kubenswrapper[4752]: I0929 11:30:55.943653 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:30:56 crc kubenswrapper[4752]: I0929 11:30:56.175617 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 11:30:56 crc kubenswrapper[4752]: I0929 11:30:56.175694 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:30:56 crc kubenswrapper[4752]: I0929 11:30:56.859147 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.395655 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.448598 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-config-data\") pod \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.448670 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-combined-ca-bundle\") pod \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.448720 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgnnl\" (UniqueName: \"kubernetes.io/projected/43876fe6-a17d-40d0-b0f3-5389fcf7c179-kube-api-access-wgnnl\") pod \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.448754 4752 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43876fe6-a17d-40d0-b0f3-5389fcf7c179-run-httpd\") pod \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.448781 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43876fe6-a17d-40d0-b0f3-5389fcf7c179-log-httpd\") pod \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.449483 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43876fe6-a17d-40d0-b0f3-5389fcf7c179-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "43876fe6-a17d-40d0-b0f3-5389fcf7c179" (UID: "43876fe6-a17d-40d0-b0f3-5389fcf7c179"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.454335 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43876fe6-a17d-40d0-b0f3-5389fcf7c179-kube-api-access-wgnnl" (OuterVolumeSpecName: "kube-api-access-wgnnl") pod "43876fe6-a17d-40d0-b0f3-5389fcf7c179" (UID: "43876fe6-a17d-40d0-b0f3-5389fcf7c179"). InnerVolumeSpecName "kube-api-access-wgnnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.454592 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43876fe6-a17d-40d0-b0f3-5389fcf7c179-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "43876fe6-a17d-40d0-b0f3-5389fcf7c179" (UID: "43876fe6-a17d-40d0-b0f3-5389fcf7c179"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.512130 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43876fe6-a17d-40d0-b0f3-5389fcf7c179" (UID: "43876fe6-a17d-40d0-b0f3-5389fcf7c179"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.536660 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-config-data" (OuterVolumeSpecName: "config-data") pod "43876fe6-a17d-40d0-b0f3-5389fcf7c179" (UID: "43876fe6-a17d-40d0-b0f3-5389fcf7c179"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.549987 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-ceilometer-tls-certs\") pod \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.550034 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-scripts\") pod \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.550183 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-sg-core-conf-yaml\") pod \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\" (UID: \"43876fe6-a17d-40d0-b0f3-5389fcf7c179\") " Sep 29 
11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.550513 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.550534 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgnnl\" (UniqueName: \"kubernetes.io/projected/43876fe6-a17d-40d0-b0f3-5389fcf7c179-kube-api-access-wgnnl\") on node \"crc\" DevicePath \"\"" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.550548 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43876fe6-a17d-40d0-b0f3-5389fcf7c179-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.550559 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43876fe6-a17d-40d0-b0f3-5389fcf7c179-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.550570 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.553121 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-scripts" (OuterVolumeSpecName: "scripts") pod "43876fe6-a17d-40d0-b0f3-5389fcf7c179" (UID: "43876fe6-a17d-40d0-b0f3-5389fcf7c179"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.569608 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "43876fe6-a17d-40d0-b0f3-5389fcf7c179" (UID: "43876fe6-a17d-40d0-b0f3-5389fcf7c179"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.589618 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "43876fe6-a17d-40d0-b0f3-5389fcf7c179" (UID: "43876fe6-a17d-40d0-b0f3-5389fcf7c179"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.652250 4752 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.652548 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.652562 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43876fe6-a17d-40d0-b0f3-5389fcf7c179-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.848114 4752 generic.go:334] "Generic (PLEG): container finished" podID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" containerID="7a211094737485805926e0f428448efffff5f17e32d47c5b891bb32355b78353" 
exitCode=0 Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.848162 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"43876fe6-a17d-40d0-b0f3-5389fcf7c179","Type":"ContainerDied","Data":"7a211094737485805926e0f428448efffff5f17e32d47c5b891bb32355b78353"} Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.848195 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.848218 4752 scope.go:117] "RemoveContainer" containerID="00bcd9f4c5cfada498cfd0d38aa30b19bc40ff1a783f39496e9595b131972799" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.848206 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"43876fe6-a17d-40d0-b0f3-5389fcf7c179","Type":"ContainerDied","Data":"ef3ffa114153547c9fbaa36df20578bf866dca015457ffc8f1813ad52b601a3f"} Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.867352 4752 scope.go:117] "RemoveContainer" containerID="9e2a41b23f24f68d12ad7d7f0df2e1b1a0aba00ffb05714477bca67fb8e879af" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.881110 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.888475 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.890757 4752 scope.go:117] "RemoveContainer" containerID="7a211094737485805926e0f428448efffff5f17e32d47c5b891bb32355b78353" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.920631 4752 scope.go:117] "RemoveContainer" containerID="96d9f0d061ed429cb0d14164aaa01daee3d32bd9e68ac061b00d9881568691b5" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.925478 4752 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:30:58 crc kubenswrapper[4752]: E0929 11:30:58.925887 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" containerName="sg-core" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.925902 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" containerName="sg-core" Sep 29 11:30:58 crc kubenswrapper[4752]: E0929 11:30:58.925919 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" containerName="proxy-httpd" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.925925 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" containerName="proxy-httpd" Sep 29 11:30:58 crc kubenswrapper[4752]: E0929 11:30:58.925947 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" containerName="ceilometer-notification-agent" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.925954 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" containerName="ceilometer-notification-agent" Sep 29 11:30:58 crc kubenswrapper[4752]: E0929 11:30:58.925969 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" containerName="ceilometer-central-agent" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.925974 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" containerName="ceilometer-central-agent" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.926120 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" containerName="ceilometer-central-agent" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.926133 4752 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" containerName="sg-core" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.926140 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" containerName="ceilometer-notification-agent" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.926155 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" containerName="proxy-httpd" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.927564 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.930539 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.931057 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.931228 4752 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.942101 4752 scope.go:117] "RemoveContainer" containerID="00bcd9f4c5cfada498cfd0d38aa30b19bc40ff1a783f39496e9595b131972799" Sep 29 11:30:58 crc kubenswrapper[4752]: E0929 11:30:58.942782 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00bcd9f4c5cfada498cfd0d38aa30b19bc40ff1a783f39496e9595b131972799\": container with ID starting with 00bcd9f4c5cfada498cfd0d38aa30b19bc40ff1a783f39496e9595b131972799 not found: ID does not exist" containerID="00bcd9f4c5cfada498cfd0d38aa30b19bc40ff1a783f39496e9595b131972799" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.942836 4752 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"00bcd9f4c5cfada498cfd0d38aa30b19bc40ff1a783f39496e9595b131972799"} err="failed to get container status \"00bcd9f4c5cfada498cfd0d38aa30b19bc40ff1a783f39496e9595b131972799\": rpc error: code = NotFound desc = could not find container \"00bcd9f4c5cfada498cfd0d38aa30b19bc40ff1a783f39496e9595b131972799\": container with ID starting with 00bcd9f4c5cfada498cfd0d38aa30b19bc40ff1a783f39496e9595b131972799 not found: ID does not exist" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.942861 4752 scope.go:117] "RemoveContainer" containerID="9e2a41b23f24f68d12ad7d7f0df2e1b1a0aba00ffb05714477bca67fb8e879af" Sep 29 11:30:58 crc kubenswrapper[4752]: E0929 11:30:58.943920 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e2a41b23f24f68d12ad7d7f0df2e1b1a0aba00ffb05714477bca67fb8e879af\": container with ID starting with 9e2a41b23f24f68d12ad7d7f0df2e1b1a0aba00ffb05714477bca67fb8e879af not found: ID does not exist" containerID="9e2a41b23f24f68d12ad7d7f0df2e1b1a0aba00ffb05714477bca67fb8e879af" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.943951 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e2a41b23f24f68d12ad7d7f0df2e1b1a0aba00ffb05714477bca67fb8e879af"} err="failed to get container status \"9e2a41b23f24f68d12ad7d7f0df2e1b1a0aba00ffb05714477bca67fb8e879af\": rpc error: code = NotFound desc = could not find container \"9e2a41b23f24f68d12ad7d7f0df2e1b1a0aba00ffb05714477bca67fb8e879af\": container with ID starting with 9e2a41b23f24f68d12ad7d7f0df2e1b1a0aba00ffb05714477bca67fb8e879af not found: ID does not exist" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.943972 4752 scope.go:117] "RemoveContainer" containerID="7a211094737485805926e0f428448efffff5f17e32d47c5b891bb32355b78353" Sep 29 11:30:58 crc kubenswrapper[4752]: E0929 11:30:58.944832 4752 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7a211094737485805926e0f428448efffff5f17e32d47c5b891bb32355b78353\": container with ID starting with 7a211094737485805926e0f428448efffff5f17e32d47c5b891bb32355b78353 not found: ID does not exist" containerID="7a211094737485805926e0f428448efffff5f17e32d47c5b891bb32355b78353" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.944873 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a211094737485805926e0f428448efffff5f17e32d47c5b891bb32355b78353"} err="failed to get container status \"7a211094737485805926e0f428448efffff5f17e32d47c5b891bb32355b78353\": rpc error: code = NotFound desc = could not find container \"7a211094737485805926e0f428448efffff5f17e32d47c5b891bb32355b78353\": container with ID starting with 7a211094737485805926e0f428448efffff5f17e32d47c5b891bb32355b78353 not found: ID does not exist" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.944889 4752 scope.go:117] "RemoveContainer" containerID="96d9f0d061ed429cb0d14164aaa01daee3d32bd9e68ac061b00d9881568691b5" Sep 29 11:30:58 crc kubenswrapper[4752]: E0929 11:30:58.945136 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96d9f0d061ed429cb0d14164aaa01daee3d32bd9e68ac061b00d9881568691b5\": container with ID starting with 96d9f0d061ed429cb0d14164aaa01daee3d32bd9e68ac061b00d9881568691b5 not found: ID does not exist" containerID="96d9f0d061ed429cb0d14164aaa01daee3d32bd9e68ac061b00d9881568691b5" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.945171 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96d9f0d061ed429cb0d14164aaa01daee3d32bd9e68ac061b00d9881568691b5"} err="failed to get container status \"96d9f0d061ed429cb0d14164aaa01daee3d32bd9e68ac061b00d9881568691b5\": rpc error: code = NotFound desc = could not find container 
\"96d9f0d061ed429cb0d14164aaa01daee3d32bd9e68ac061b00d9881568691b5\": container with ID starting with 96d9f0d061ed429cb0d14164aaa01daee3d32bd9e68ac061b00d9881568691b5 not found: ID does not exist" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.947678 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.956163 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b6facb1-a514-4754-bc86-f3b7acd0fc99-scripts\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.956202 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6facb1-a514-4754-bc86-f3b7acd0fc99-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.956224 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qjrb\" (UniqueName: \"kubernetes.io/projected/7b6facb1-a514-4754-bc86-f3b7acd0fc99-kube-api-access-7qjrb\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.956241 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b6facb1-a514-4754-bc86-f3b7acd0fc99-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.956274 4752 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b6facb1-a514-4754-bc86-f3b7acd0fc99-run-httpd\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.956292 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b6facb1-a514-4754-bc86-f3b7acd0fc99-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.956334 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b6facb1-a514-4754-bc86-f3b7acd0fc99-log-httpd\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:58 crc kubenswrapper[4752]: I0929 11:30:58.956366 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6facb1-a514-4754-bc86-f3b7acd0fc99-config-data\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:59 crc kubenswrapper[4752]: I0929 11:30:59.057151 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qjrb\" (UniqueName: \"kubernetes.io/projected/7b6facb1-a514-4754-bc86-f3b7acd0fc99-kube-api-access-7qjrb\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:59 crc kubenswrapper[4752]: I0929 11:30:59.057188 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b6facb1-a514-4754-bc86-f3b7acd0fc99-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:59 crc kubenswrapper[4752]: I0929 11:30:59.057223 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b6facb1-a514-4754-bc86-f3b7acd0fc99-run-httpd\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:59 crc kubenswrapper[4752]: I0929 11:30:59.057240 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b6facb1-a514-4754-bc86-f3b7acd0fc99-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:59 crc kubenswrapper[4752]: I0929 11:30:59.057378 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b6facb1-a514-4754-bc86-f3b7acd0fc99-log-httpd\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:59 crc kubenswrapper[4752]: I0929 11:30:59.057565 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6facb1-a514-4754-bc86-f3b7acd0fc99-config-data\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:59 crc kubenswrapper[4752]: I0929 11:30:59.057736 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b6facb1-a514-4754-bc86-f3b7acd0fc99-scripts\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " 
pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:59 crc kubenswrapper[4752]: I0929 11:30:59.057776 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6facb1-a514-4754-bc86-f3b7acd0fc99-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:59 crc kubenswrapper[4752]: I0929 11:30:59.058068 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b6facb1-a514-4754-bc86-f3b7acd0fc99-run-httpd\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:59 crc kubenswrapper[4752]: I0929 11:30:59.058145 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b6facb1-a514-4754-bc86-f3b7acd0fc99-log-httpd\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:59 crc kubenswrapper[4752]: I0929 11:30:59.061604 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b6facb1-a514-4754-bc86-f3b7acd0fc99-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:59 crc kubenswrapper[4752]: I0929 11:30:59.062220 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b6facb1-a514-4754-bc86-f3b7acd0fc99-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:59 crc kubenswrapper[4752]: I0929 11:30:59.062694 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/7b6facb1-a514-4754-bc86-f3b7acd0fc99-scripts\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:59 crc kubenswrapper[4752]: I0929 11:30:59.062997 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6facb1-a514-4754-bc86-f3b7acd0fc99-config-data\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:59 crc kubenswrapper[4752]: I0929 11:30:59.064669 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6facb1-a514-4754-bc86-f3b7acd0fc99-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:59 crc kubenswrapper[4752]: I0929 11:30:59.076713 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qjrb\" (UniqueName: \"kubernetes.io/projected/7b6facb1-a514-4754-bc86-f3b7acd0fc99-kube-api-access-7qjrb\") pod \"ceilometer-0\" (UID: \"7b6facb1-a514-4754-bc86-f3b7acd0fc99\") " pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:59 crc kubenswrapper[4752]: I0929 11:30:59.245185 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:30:59 crc kubenswrapper[4752]: I0929 11:30:59.675583 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Sep 29 11:30:59 crc kubenswrapper[4752]: I0929 11:30:59.681592 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 11:30:59 crc kubenswrapper[4752]: I0929 11:30:59.857165 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7b6facb1-a514-4754-bc86-f3b7acd0fc99","Type":"ContainerStarted","Data":"fb8d1ae52d4693b4dce6c1b8f13f7904c32dc291053f9fcb39d752195d3142f8"} Sep 29 11:31:00 crc kubenswrapper[4752]: I0929 11:31:00.041907 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43876fe6-a17d-40d0-b0f3-5389fcf7c179" path="/var/lib/kubelet/pods/43876fe6-a17d-40d0-b0f3-5389fcf7c179/volumes" Sep 29 11:31:00 crc kubenswrapper[4752]: I0929 11:31:00.868733 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7b6facb1-a514-4754-bc86-f3b7acd0fc99","Type":"ContainerStarted","Data":"78ee0e16b969d9cbb0e570465e7d06ffb20c70ff0f79e5a431480df20f00be77"} Sep 29 11:31:01 crc kubenswrapper[4752]: I0929 11:31:01.880176 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7b6facb1-a514-4754-bc86-f3b7acd0fc99","Type":"ContainerStarted","Data":"940766b14d168ef67d37dd425f93e5e96109aa9599498ea38c6de7f8dee0fc71"} Sep 29 11:31:01 crc kubenswrapper[4752]: I0929 11:31:01.880793 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7b6facb1-a514-4754-bc86-f3b7acd0fc99","Type":"ContainerStarted","Data":"fd773cb7fc705a57cbef31dc82a0b46f7755a6dc8361ca6471119c487aa19d9f"} Sep 29 11:31:03 crc kubenswrapper[4752]: I0929 11:31:03.897461 4752 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7b6facb1-a514-4754-bc86-f3b7acd0fc99","Type":"ContainerStarted","Data":"74ba7ba755e7b3188f9c86e82c5f7b4e4d629818db186c8c19c7887a71aeef40"} Sep 29 11:31:03 crc kubenswrapper[4752]: I0929 11:31:03.897830 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:31:03 crc kubenswrapper[4752]: I0929 11:31:03.925305 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.480417055 podStartE2EDuration="5.925284733s" podCreationTimestamp="2025-09-29 11:30:58 +0000 UTC" firstStartedPulling="2025-09-29 11:30:59.681311324 +0000 UTC m=+2800.470452991" lastFinishedPulling="2025-09-29 11:31:03.126179002 +0000 UTC m=+2803.915320669" observedRunningTime="2025-09-29 11:31:03.922337475 +0000 UTC m=+2804.711479152" watchObservedRunningTime="2025-09-29 11:31:03.925284733 +0000 UTC m=+2804.714426400" Sep 29 11:31:22 crc kubenswrapper[4752]: I0929 11:31:22.757083 4752 scope.go:117] "RemoveContainer" containerID="460a2a199251eca257f5d76305a4ea39ad8b86f7cc868993b6371f1270ca6437" Sep 29 11:31:26 crc kubenswrapper[4752]: I0929 11:31:26.175290 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 11:31:26 crc kubenswrapper[4752]: I0929 11:31:26.176037 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:31:29 crc kubenswrapper[4752]: I0929 11:31:29.254016 4752 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Sep 29 11:31:56 crc kubenswrapper[4752]: I0929 11:31:56.175540 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 11:31:56 crc kubenswrapper[4752]: I0929 11:31:56.176106 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:31:56 crc kubenswrapper[4752]: I0929 11:31:56.176151 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 11:31:56 crc kubenswrapper[4752]: I0929 11:31:56.176755 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6"} pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 11:31:56 crc kubenswrapper[4752]: I0929 11:31:56.176798 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" containerID="cri-o://4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" gracePeriod=600 Sep 29 11:31:56 crc kubenswrapper[4752]: E0929 11:31:56.306058 4752 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:31:56 crc kubenswrapper[4752]: I0929 11:31:56.326957 4752 generic.go:334] "Generic (PLEG): container finished" podID="5863c243-797d-462a-b11f-71aaf005f8d1" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" exitCode=0 Sep 29 11:31:56 crc kubenswrapper[4752]: I0929 11:31:56.327001 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerDied","Data":"4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6"} Sep 29 11:31:56 crc kubenswrapper[4752]: I0929 11:31:56.327044 4752 scope.go:117] "RemoveContainer" containerID="62794379c57b608ef54dfd72ad21536541e2a354845aabac9be8f3dde3444511" Sep 29 11:31:56 crc kubenswrapper[4752]: I0929 11:31:56.328033 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:31:56 crc kubenswrapper[4752]: E0929 11:31:56.328339 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:32:11 crc kubenswrapper[4752]: I0929 11:32:11.431227 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6nrpv"] Sep 
29 11:32:11 crc kubenswrapper[4752]: I0929 11:32:11.438817 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6nrpv" Sep 29 11:32:11 crc kubenswrapper[4752]: I0929 11:32:11.446892 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6nrpv"] Sep 29 11:32:11 crc kubenswrapper[4752]: I0929 11:32:11.468451 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drh82\" (UniqueName: \"kubernetes.io/projected/4b22e897-bb2a-4714-8d4c-3c7a659bda9f-kube-api-access-drh82\") pod \"community-operators-6nrpv\" (UID: \"4b22e897-bb2a-4714-8d4c-3c7a659bda9f\") " pod="openshift-marketplace/community-operators-6nrpv" Sep 29 11:32:11 crc kubenswrapper[4752]: I0929 11:32:11.468567 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b22e897-bb2a-4714-8d4c-3c7a659bda9f-utilities\") pod \"community-operators-6nrpv\" (UID: \"4b22e897-bb2a-4714-8d4c-3c7a659bda9f\") " pod="openshift-marketplace/community-operators-6nrpv" Sep 29 11:32:11 crc kubenswrapper[4752]: I0929 11:32:11.468625 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b22e897-bb2a-4714-8d4c-3c7a659bda9f-catalog-content\") pod \"community-operators-6nrpv\" (UID: \"4b22e897-bb2a-4714-8d4c-3c7a659bda9f\") " pod="openshift-marketplace/community-operators-6nrpv" Sep 29 11:32:11 crc kubenswrapper[4752]: I0929 11:32:11.569665 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drh82\" (UniqueName: \"kubernetes.io/projected/4b22e897-bb2a-4714-8d4c-3c7a659bda9f-kube-api-access-drh82\") pod \"community-operators-6nrpv\" (UID: \"4b22e897-bb2a-4714-8d4c-3c7a659bda9f\") " 
pod="openshift-marketplace/community-operators-6nrpv" Sep 29 11:32:11 crc kubenswrapper[4752]: I0929 11:32:11.569755 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b22e897-bb2a-4714-8d4c-3c7a659bda9f-utilities\") pod \"community-operators-6nrpv\" (UID: \"4b22e897-bb2a-4714-8d4c-3c7a659bda9f\") " pod="openshift-marketplace/community-operators-6nrpv" Sep 29 11:32:11 crc kubenswrapper[4752]: I0929 11:32:11.569881 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b22e897-bb2a-4714-8d4c-3c7a659bda9f-catalog-content\") pod \"community-operators-6nrpv\" (UID: \"4b22e897-bb2a-4714-8d4c-3c7a659bda9f\") " pod="openshift-marketplace/community-operators-6nrpv" Sep 29 11:32:11 crc kubenswrapper[4752]: I0929 11:32:11.570545 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b22e897-bb2a-4714-8d4c-3c7a659bda9f-catalog-content\") pod \"community-operators-6nrpv\" (UID: \"4b22e897-bb2a-4714-8d4c-3c7a659bda9f\") " pod="openshift-marketplace/community-operators-6nrpv" Sep 29 11:32:11 crc kubenswrapper[4752]: I0929 11:32:11.570589 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b22e897-bb2a-4714-8d4c-3c7a659bda9f-utilities\") pod \"community-operators-6nrpv\" (UID: \"4b22e897-bb2a-4714-8d4c-3c7a659bda9f\") " pod="openshift-marketplace/community-operators-6nrpv" Sep 29 11:32:11 crc kubenswrapper[4752]: I0929 11:32:11.593361 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drh82\" (UniqueName: \"kubernetes.io/projected/4b22e897-bb2a-4714-8d4c-3c7a659bda9f-kube-api-access-drh82\") pod \"community-operators-6nrpv\" (UID: \"4b22e897-bb2a-4714-8d4c-3c7a659bda9f\") " 
pod="openshift-marketplace/community-operators-6nrpv" Sep 29 11:32:11 crc kubenswrapper[4752]: I0929 11:32:11.769314 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6nrpv" Sep 29 11:32:12 crc kubenswrapper[4752]: I0929 11:32:12.033593 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:32:12 crc kubenswrapper[4752]: E0929 11:32:12.033949 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:32:12 crc kubenswrapper[4752]: I0929 11:32:12.231905 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6nrpv"] Sep 29 11:32:12 crc kubenswrapper[4752]: I0929 11:32:12.465191 4752 generic.go:334] "Generic (PLEG): container finished" podID="4b22e897-bb2a-4714-8d4c-3c7a659bda9f" containerID="4601aebbb0c6217ec48818f5ec9f565f3e69afc3a67103482a1021fa764432c0" exitCode=0 Sep 29 11:32:12 crc kubenswrapper[4752]: I0929 11:32:12.465259 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nrpv" event={"ID":"4b22e897-bb2a-4714-8d4c-3c7a659bda9f","Type":"ContainerDied","Data":"4601aebbb0c6217ec48818f5ec9f565f3e69afc3a67103482a1021fa764432c0"} Sep 29 11:32:12 crc kubenswrapper[4752]: I0929 11:32:12.466650 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nrpv" event={"ID":"4b22e897-bb2a-4714-8d4c-3c7a659bda9f","Type":"ContainerStarted","Data":"d30ae39c82e5be5e6ba384031f3217c725fcb125c2fe9b71f6dba6ac5aadd7ae"} Sep 29 
11:32:13 crc kubenswrapper[4752]: I0929 11:32:13.475715 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nrpv" event={"ID":"4b22e897-bb2a-4714-8d4c-3c7a659bda9f","Type":"ContainerStarted","Data":"f94fe12816468594f3510559bfa5a35cacb8b320ac03f2b62eedba7fbdaf713e"} Sep 29 11:32:14 crc kubenswrapper[4752]: I0929 11:32:14.485920 4752 generic.go:334] "Generic (PLEG): container finished" podID="4b22e897-bb2a-4714-8d4c-3c7a659bda9f" containerID="f94fe12816468594f3510559bfa5a35cacb8b320ac03f2b62eedba7fbdaf713e" exitCode=0 Sep 29 11:32:14 crc kubenswrapper[4752]: I0929 11:32:14.486010 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nrpv" event={"ID":"4b22e897-bb2a-4714-8d4c-3c7a659bda9f","Type":"ContainerDied","Data":"f94fe12816468594f3510559bfa5a35cacb8b320ac03f2b62eedba7fbdaf713e"} Sep 29 11:32:15 crc kubenswrapper[4752]: I0929 11:32:15.496637 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nrpv" event={"ID":"4b22e897-bb2a-4714-8d4c-3c7a659bda9f","Type":"ContainerStarted","Data":"103d78a98759778e44f2a34a31201442952e53ea7dedc36916f9f37361177b02"} Sep 29 11:32:15 crc kubenswrapper[4752]: I0929 11:32:15.519660 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6nrpv" podStartSLOduration=1.850422405 podStartE2EDuration="4.519642845s" podCreationTimestamp="2025-09-29 11:32:11 +0000 UTC" firstStartedPulling="2025-09-29 11:32:12.466766481 +0000 UTC m=+2873.255908148" lastFinishedPulling="2025-09-29 11:32:15.135986921 +0000 UTC m=+2875.925128588" observedRunningTime="2025-09-29 11:32:15.512663882 +0000 UTC m=+2876.301805559" watchObservedRunningTime="2025-09-29 11:32:15.519642845 +0000 UTC m=+2876.308784522" Sep 29 11:32:21 crc kubenswrapper[4752]: I0929 11:32:21.770715 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-6nrpv" Sep 29 11:32:21 crc kubenswrapper[4752]: I0929 11:32:21.771165 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6nrpv" Sep 29 11:32:21 crc kubenswrapper[4752]: I0929 11:32:21.855430 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6nrpv" Sep 29 11:32:22 crc kubenswrapper[4752]: I0929 11:32:22.615682 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6nrpv" Sep 29 11:32:22 crc kubenswrapper[4752]: I0929 11:32:22.844469 4752 scope.go:117] "RemoveContainer" containerID="3601f11d6fce7e797b457f7c41b8cf7b4a88ba2c4df1e732ad5453804b2f295f" Sep 29 11:32:22 crc kubenswrapper[4752]: I0929 11:32:22.873360 4752 scope.go:117] "RemoveContainer" containerID="94070899e8acd4d5e8e47c4c0add85ef44035fb6afb55608e67393ef4d7d1331" Sep 29 11:32:22 crc kubenswrapper[4752]: I0929 11:32:22.917338 4752 scope.go:117] "RemoveContainer" containerID="cec29d708524b25c21787a568e273c2da24bcda5189c74bd71dbe5c47cb35afa" Sep 29 11:32:22 crc kubenswrapper[4752]: I0929 11:32:22.970007 4752 scope.go:117] "RemoveContainer" containerID="c678f0d8a2213be62656dc64a578177c4c5f70f2a4ba2376857466766218c920" Sep 29 11:32:24 crc kubenswrapper[4752]: I0929 11:32:24.031468 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:32:24 crc kubenswrapper[4752]: E0929 11:32:24.031966 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" 
podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:32:25 crc kubenswrapper[4752]: I0929 11:32:25.411963 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6nrpv"] Sep 29 11:32:25 crc kubenswrapper[4752]: I0929 11:32:25.412519 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6nrpv" podUID="4b22e897-bb2a-4714-8d4c-3c7a659bda9f" containerName="registry-server" containerID="cri-o://103d78a98759778e44f2a34a31201442952e53ea7dedc36916f9f37361177b02" gracePeriod=2 Sep 29 11:32:25 crc kubenswrapper[4752]: I0929 11:32:25.585114 4752 generic.go:334] "Generic (PLEG): container finished" podID="4b22e897-bb2a-4714-8d4c-3c7a659bda9f" containerID="103d78a98759778e44f2a34a31201442952e53ea7dedc36916f9f37361177b02" exitCode=0 Sep 29 11:32:25 crc kubenswrapper[4752]: I0929 11:32:25.585158 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nrpv" event={"ID":"4b22e897-bb2a-4714-8d4c-3c7a659bda9f","Type":"ContainerDied","Data":"103d78a98759778e44f2a34a31201442952e53ea7dedc36916f9f37361177b02"} Sep 29 11:32:25 crc kubenswrapper[4752]: I0929 11:32:25.826345 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6nrpv" Sep 29 11:32:26 crc kubenswrapper[4752]: I0929 11:32:26.011840 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b22e897-bb2a-4714-8d4c-3c7a659bda9f-utilities\") pod \"4b22e897-bb2a-4714-8d4c-3c7a659bda9f\" (UID: \"4b22e897-bb2a-4714-8d4c-3c7a659bda9f\") " Sep 29 11:32:26 crc kubenswrapper[4752]: I0929 11:32:26.012142 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drh82\" (UniqueName: \"kubernetes.io/projected/4b22e897-bb2a-4714-8d4c-3c7a659bda9f-kube-api-access-drh82\") pod \"4b22e897-bb2a-4714-8d4c-3c7a659bda9f\" (UID: \"4b22e897-bb2a-4714-8d4c-3c7a659bda9f\") " Sep 29 11:32:26 crc kubenswrapper[4752]: I0929 11:32:26.012370 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b22e897-bb2a-4714-8d4c-3c7a659bda9f-catalog-content\") pod \"4b22e897-bb2a-4714-8d4c-3c7a659bda9f\" (UID: \"4b22e897-bb2a-4714-8d4c-3c7a659bda9f\") " Sep 29 11:32:26 crc kubenswrapper[4752]: I0929 11:32:26.012788 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b22e897-bb2a-4714-8d4c-3c7a659bda9f-utilities" (OuterVolumeSpecName: "utilities") pod "4b22e897-bb2a-4714-8d4c-3c7a659bda9f" (UID: "4b22e897-bb2a-4714-8d4c-3c7a659bda9f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:32:26 crc kubenswrapper[4752]: I0929 11:32:26.035002 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b22e897-bb2a-4714-8d4c-3c7a659bda9f-kube-api-access-drh82" (OuterVolumeSpecName: "kube-api-access-drh82") pod "4b22e897-bb2a-4714-8d4c-3c7a659bda9f" (UID: "4b22e897-bb2a-4714-8d4c-3c7a659bda9f"). InnerVolumeSpecName "kube-api-access-drh82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:32:26 crc kubenswrapper[4752]: I0929 11:32:26.080013 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b22e897-bb2a-4714-8d4c-3c7a659bda9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b22e897-bb2a-4714-8d4c-3c7a659bda9f" (UID: "4b22e897-bb2a-4714-8d4c-3c7a659bda9f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:32:26 crc kubenswrapper[4752]: I0929 11:32:26.114693 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b22e897-bb2a-4714-8d4c-3c7a659bda9f-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 11:32:26 crc kubenswrapper[4752]: I0929 11:32:26.114728 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drh82\" (UniqueName: \"kubernetes.io/projected/4b22e897-bb2a-4714-8d4c-3c7a659bda9f-kube-api-access-drh82\") on node \"crc\" DevicePath \"\"" Sep 29 11:32:26 crc kubenswrapper[4752]: I0929 11:32:26.114738 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b22e897-bb2a-4714-8d4c-3c7a659bda9f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 11:32:26 crc kubenswrapper[4752]: I0929 11:32:26.595876 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nrpv" event={"ID":"4b22e897-bb2a-4714-8d4c-3c7a659bda9f","Type":"ContainerDied","Data":"d30ae39c82e5be5e6ba384031f3217c725fcb125c2fe9b71f6dba6ac5aadd7ae"} Sep 29 11:32:26 crc kubenswrapper[4752]: I0929 11:32:26.595935 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6nrpv" Sep 29 11:32:26 crc kubenswrapper[4752]: I0929 11:32:26.597191 4752 scope.go:117] "RemoveContainer" containerID="103d78a98759778e44f2a34a31201442952e53ea7dedc36916f9f37361177b02" Sep 29 11:32:26 crc kubenswrapper[4752]: I0929 11:32:26.632192 4752 scope.go:117] "RemoveContainer" containerID="f94fe12816468594f3510559bfa5a35cacb8b320ac03f2b62eedba7fbdaf713e" Sep 29 11:32:26 crc kubenswrapper[4752]: I0929 11:32:26.639976 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6nrpv"] Sep 29 11:32:26 crc kubenswrapper[4752]: I0929 11:32:26.657078 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6nrpv"] Sep 29 11:32:26 crc kubenswrapper[4752]: I0929 11:32:26.675493 4752 scope.go:117] "RemoveContainer" containerID="4601aebbb0c6217ec48818f5ec9f565f3e69afc3a67103482a1021fa764432c0" Sep 29 11:32:28 crc kubenswrapper[4752]: I0929 11:32:28.041497 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b22e897-bb2a-4714-8d4c-3c7a659bda9f" path="/var/lib/kubelet/pods/4b22e897-bb2a-4714-8d4c-3c7a659bda9f/volumes" Sep 29 11:32:36 crc kubenswrapper[4752]: I0929 11:32:36.034581 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:32:36 crc kubenswrapper[4752]: E0929 11:32:36.035348 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:32:51 crc kubenswrapper[4752]: I0929 11:32:51.031508 4752 scope.go:117] "RemoveContainer" 
containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:32:51 crc kubenswrapper[4752]: E0929 11:32:51.032331 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:33:04 crc kubenswrapper[4752]: I0929 11:33:04.030788 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:33:04 crc kubenswrapper[4752]: E0929 11:33:04.031659 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:33:18 crc kubenswrapper[4752]: I0929 11:33:18.031670 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:33:18 crc kubenswrapper[4752]: E0929 11:33:18.032576 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:33:31 crc kubenswrapper[4752]: I0929 11:33:31.031402 4752 scope.go:117] 
"RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:33:31 crc kubenswrapper[4752]: E0929 11:33:31.032165 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:33:37 crc kubenswrapper[4752]: I0929 11:33:37.028486 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pnxzg"] Sep 29 11:33:37 crc kubenswrapper[4752]: E0929 11:33:37.029561 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b22e897-bb2a-4714-8d4c-3c7a659bda9f" containerName="extract-utilities" Sep 29 11:33:37 crc kubenswrapper[4752]: I0929 11:33:37.029579 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b22e897-bb2a-4714-8d4c-3c7a659bda9f" containerName="extract-utilities" Sep 29 11:33:37 crc kubenswrapper[4752]: E0929 11:33:37.029601 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b22e897-bb2a-4714-8d4c-3c7a659bda9f" containerName="extract-content" Sep 29 11:33:37 crc kubenswrapper[4752]: I0929 11:33:37.029610 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b22e897-bb2a-4714-8d4c-3c7a659bda9f" containerName="extract-content" Sep 29 11:33:37 crc kubenswrapper[4752]: E0929 11:33:37.029647 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b22e897-bb2a-4714-8d4c-3c7a659bda9f" containerName="registry-server" Sep 29 11:33:37 crc kubenswrapper[4752]: I0929 11:33:37.029656 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b22e897-bb2a-4714-8d4c-3c7a659bda9f" containerName="registry-server" Sep 29 11:33:37 crc kubenswrapper[4752]: I0929 
11:33:37.029881 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b22e897-bb2a-4714-8d4c-3c7a659bda9f" containerName="registry-server" Sep 29 11:33:37 crc kubenswrapper[4752]: I0929 11:33:37.031679 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pnxzg" Sep 29 11:33:37 crc kubenswrapper[4752]: I0929 11:33:37.041113 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pnxzg"] Sep 29 11:33:37 crc kubenswrapper[4752]: I0929 11:33:37.148702 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d3a8ea2-c5d1-43e3-911d-74b6402135ee-catalog-content\") pod \"redhat-operators-pnxzg\" (UID: \"0d3a8ea2-c5d1-43e3-911d-74b6402135ee\") " pod="openshift-marketplace/redhat-operators-pnxzg" Sep 29 11:33:37 crc kubenswrapper[4752]: I0929 11:33:37.149040 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d3a8ea2-c5d1-43e3-911d-74b6402135ee-utilities\") pod \"redhat-operators-pnxzg\" (UID: \"0d3a8ea2-c5d1-43e3-911d-74b6402135ee\") " pod="openshift-marketplace/redhat-operators-pnxzg" Sep 29 11:33:37 crc kubenswrapper[4752]: I0929 11:33:37.149193 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxxgm\" (UniqueName: \"kubernetes.io/projected/0d3a8ea2-c5d1-43e3-911d-74b6402135ee-kube-api-access-pxxgm\") pod \"redhat-operators-pnxzg\" (UID: \"0d3a8ea2-c5d1-43e3-911d-74b6402135ee\") " pod="openshift-marketplace/redhat-operators-pnxzg" Sep 29 11:33:37 crc kubenswrapper[4752]: I0929 11:33:37.250688 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d3a8ea2-c5d1-43e3-911d-74b6402135ee-utilities\") pod 
\"redhat-operators-pnxzg\" (UID: \"0d3a8ea2-c5d1-43e3-911d-74b6402135ee\") " pod="openshift-marketplace/redhat-operators-pnxzg" Sep 29 11:33:37 crc kubenswrapper[4752]: I0929 11:33:37.251069 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxxgm\" (UniqueName: \"kubernetes.io/projected/0d3a8ea2-c5d1-43e3-911d-74b6402135ee-kube-api-access-pxxgm\") pod \"redhat-operators-pnxzg\" (UID: \"0d3a8ea2-c5d1-43e3-911d-74b6402135ee\") " pod="openshift-marketplace/redhat-operators-pnxzg" Sep 29 11:33:37 crc kubenswrapper[4752]: I0929 11:33:37.251173 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d3a8ea2-c5d1-43e3-911d-74b6402135ee-catalog-content\") pod \"redhat-operators-pnxzg\" (UID: \"0d3a8ea2-c5d1-43e3-911d-74b6402135ee\") " pod="openshift-marketplace/redhat-operators-pnxzg" Sep 29 11:33:37 crc kubenswrapper[4752]: I0929 11:33:37.251258 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d3a8ea2-c5d1-43e3-911d-74b6402135ee-utilities\") pod \"redhat-operators-pnxzg\" (UID: \"0d3a8ea2-c5d1-43e3-911d-74b6402135ee\") " pod="openshift-marketplace/redhat-operators-pnxzg" Sep 29 11:33:37 crc kubenswrapper[4752]: I0929 11:33:37.251651 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d3a8ea2-c5d1-43e3-911d-74b6402135ee-catalog-content\") pod \"redhat-operators-pnxzg\" (UID: \"0d3a8ea2-c5d1-43e3-911d-74b6402135ee\") " pod="openshift-marketplace/redhat-operators-pnxzg" Sep 29 11:33:37 crc kubenswrapper[4752]: I0929 11:33:37.270974 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxxgm\" (UniqueName: \"kubernetes.io/projected/0d3a8ea2-c5d1-43e3-911d-74b6402135ee-kube-api-access-pxxgm\") pod \"redhat-operators-pnxzg\" (UID: 
\"0d3a8ea2-c5d1-43e3-911d-74b6402135ee\") " pod="openshift-marketplace/redhat-operators-pnxzg" Sep 29 11:33:37 crc kubenswrapper[4752]: I0929 11:33:37.352976 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pnxzg" Sep 29 11:33:37 crc kubenswrapper[4752]: I0929 11:33:37.597432 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pnxzg"] Sep 29 11:33:38 crc kubenswrapper[4752]: I0929 11:33:38.172696 4752 generic.go:334] "Generic (PLEG): container finished" podID="0d3a8ea2-c5d1-43e3-911d-74b6402135ee" containerID="69a19c73734901b941d9160f813e5eddf67cf23d0e587fa0b46f9f08f7845fb9" exitCode=0 Sep 29 11:33:38 crc kubenswrapper[4752]: I0929 11:33:38.172755 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pnxzg" event={"ID":"0d3a8ea2-c5d1-43e3-911d-74b6402135ee","Type":"ContainerDied","Data":"69a19c73734901b941d9160f813e5eddf67cf23d0e587fa0b46f9f08f7845fb9"} Sep 29 11:33:38 crc kubenswrapper[4752]: I0929 11:33:38.173044 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pnxzg" event={"ID":"0d3a8ea2-c5d1-43e3-911d-74b6402135ee","Type":"ContainerStarted","Data":"4c0b0aeb31f3628d1e470bc9ab856242f07b86423946ef59b741a9c8c8f0da42"} Sep 29 11:33:39 crc kubenswrapper[4752]: I0929 11:33:39.195686 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pnxzg" event={"ID":"0d3a8ea2-c5d1-43e3-911d-74b6402135ee","Type":"ContainerStarted","Data":"e39cedc08c58faaedc9641a38a9291885e99c79f643885a132078d5a2d8925ec"} Sep 29 11:33:40 crc kubenswrapper[4752]: I0929 11:33:40.206053 4752 generic.go:334] "Generic (PLEG): container finished" podID="0d3a8ea2-c5d1-43e3-911d-74b6402135ee" containerID="e39cedc08c58faaedc9641a38a9291885e99c79f643885a132078d5a2d8925ec" exitCode=0 Sep 29 11:33:40 crc kubenswrapper[4752]: I0929 11:33:40.206115 4752 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pnxzg" event={"ID":"0d3a8ea2-c5d1-43e3-911d-74b6402135ee","Type":"ContainerDied","Data":"e39cedc08c58faaedc9641a38a9291885e99c79f643885a132078d5a2d8925ec"} Sep 29 11:33:41 crc kubenswrapper[4752]: I0929 11:33:41.220372 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pnxzg" event={"ID":"0d3a8ea2-c5d1-43e3-911d-74b6402135ee","Type":"ContainerStarted","Data":"f6a67a0b98c4573f28092a502b95480bc2ab1a7f24b87c05e9d9042398894be4"} Sep 29 11:33:45 crc kubenswrapper[4752]: I0929 11:33:45.031362 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:33:45 crc kubenswrapper[4752]: E0929 11:33:45.031901 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:33:47 crc kubenswrapper[4752]: I0929 11:33:47.353680 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pnxzg" Sep 29 11:33:47 crc kubenswrapper[4752]: I0929 11:33:47.353982 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pnxzg" Sep 29 11:33:47 crc kubenswrapper[4752]: I0929 11:33:47.405588 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pnxzg" Sep 29 11:33:47 crc kubenswrapper[4752]: I0929 11:33:47.434155 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pnxzg" 
podStartSLOduration=8.003355079 podStartE2EDuration="10.434134036s" podCreationTimestamp="2025-09-29 11:33:37 +0000 UTC" firstStartedPulling="2025-09-29 11:33:38.174207064 +0000 UTC m=+2958.963348731" lastFinishedPulling="2025-09-29 11:33:40.604986021 +0000 UTC m=+2961.394127688" observedRunningTime="2025-09-29 11:33:41.275910051 +0000 UTC m=+2962.065051738" watchObservedRunningTime="2025-09-29 11:33:47.434134036 +0000 UTC m=+2968.223275703" Sep 29 11:33:48 crc kubenswrapper[4752]: I0929 11:33:48.321778 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pnxzg" Sep 29 11:33:51 crc kubenswrapper[4752]: I0929 11:33:51.014431 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pnxzg"] Sep 29 11:33:51 crc kubenswrapper[4752]: I0929 11:33:51.015141 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pnxzg" podUID="0d3a8ea2-c5d1-43e3-911d-74b6402135ee" containerName="registry-server" containerID="cri-o://f6a67a0b98c4573f28092a502b95480bc2ab1a7f24b87c05e9d9042398894be4" gracePeriod=2 Sep 29 11:33:51 crc kubenswrapper[4752]: I0929 11:33:51.302173 4752 generic.go:334] "Generic (PLEG): container finished" podID="0d3a8ea2-c5d1-43e3-911d-74b6402135ee" containerID="f6a67a0b98c4573f28092a502b95480bc2ab1a7f24b87c05e9d9042398894be4" exitCode=0 Sep 29 11:33:51 crc kubenswrapper[4752]: I0929 11:33:51.302243 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pnxzg" event={"ID":"0d3a8ea2-c5d1-43e3-911d-74b6402135ee","Type":"ContainerDied","Data":"f6a67a0b98c4573f28092a502b95480bc2ab1a7f24b87c05e9d9042398894be4"} Sep 29 11:33:52 crc kubenswrapper[4752]: I0929 11:33:52.018681 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pnxzg" Sep 29 11:33:52 crc kubenswrapper[4752]: I0929 11:33:52.087475 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d3a8ea2-c5d1-43e3-911d-74b6402135ee-catalog-content\") pod \"0d3a8ea2-c5d1-43e3-911d-74b6402135ee\" (UID: \"0d3a8ea2-c5d1-43e3-911d-74b6402135ee\") " Sep 29 11:33:52 crc kubenswrapper[4752]: I0929 11:33:52.087704 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d3a8ea2-c5d1-43e3-911d-74b6402135ee-utilities\") pod \"0d3a8ea2-c5d1-43e3-911d-74b6402135ee\" (UID: \"0d3a8ea2-c5d1-43e3-911d-74b6402135ee\") " Sep 29 11:33:52 crc kubenswrapper[4752]: I0929 11:33:52.087737 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxxgm\" (UniqueName: \"kubernetes.io/projected/0d3a8ea2-c5d1-43e3-911d-74b6402135ee-kube-api-access-pxxgm\") pod \"0d3a8ea2-c5d1-43e3-911d-74b6402135ee\" (UID: \"0d3a8ea2-c5d1-43e3-911d-74b6402135ee\") " Sep 29 11:33:52 crc kubenswrapper[4752]: I0929 11:33:52.088738 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d3a8ea2-c5d1-43e3-911d-74b6402135ee-utilities" (OuterVolumeSpecName: "utilities") pod "0d3a8ea2-c5d1-43e3-911d-74b6402135ee" (UID: "0d3a8ea2-c5d1-43e3-911d-74b6402135ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:33:52 crc kubenswrapper[4752]: I0929 11:33:52.092349 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d3a8ea2-c5d1-43e3-911d-74b6402135ee-kube-api-access-pxxgm" (OuterVolumeSpecName: "kube-api-access-pxxgm") pod "0d3a8ea2-c5d1-43e3-911d-74b6402135ee" (UID: "0d3a8ea2-c5d1-43e3-911d-74b6402135ee"). InnerVolumeSpecName "kube-api-access-pxxgm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:33:52 crc kubenswrapper[4752]: I0929 11:33:52.174603 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d3a8ea2-c5d1-43e3-911d-74b6402135ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d3a8ea2-c5d1-43e3-911d-74b6402135ee" (UID: "0d3a8ea2-c5d1-43e3-911d-74b6402135ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:33:52 crc kubenswrapper[4752]: I0929 11:33:52.189509 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d3a8ea2-c5d1-43e3-911d-74b6402135ee-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 11:33:52 crc kubenswrapper[4752]: I0929 11:33:52.189543 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxxgm\" (UniqueName: \"kubernetes.io/projected/0d3a8ea2-c5d1-43e3-911d-74b6402135ee-kube-api-access-pxxgm\") on node \"crc\" DevicePath \"\"" Sep 29 11:33:52 crc kubenswrapper[4752]: I0929 11:33:52.189554 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d3a8ea2-c5d1-43e3-911d-74b6402135ee-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 11:33:52 crc kubenswrapper[4752]: I0929 11:33:52.314405 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pnxzg" event={"ID":"0d3a8ea2-c5d1-43e3-911d-74b6402135ee","Type":"ContainerDied","Data":"4c0b0aeb31f3628d1e470bc9ab856242f07b86423946ef59b741a9c8c8f0da42"} Sep 29 11:33:52 crc kubenswrapper[4752]: I0929 11:33:52.314442 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pnxzg" Sep 29 11:33:52 crc kubenswrapper[4752]: I0929 11:33:52.314507 4752 scope.go:117] "RemoveContainer" containerID="f6a67a0b98c4573f28092a502b95480bc2ab1a7f24b87c05e9d9042398894be4" Sep 29 11:33:52 crc kubenswrapper[4752]: I0929 11:33:52.333732 4752 scope.go:117] "RemoveContainer" containerID="e39cedc08c58faaedc9641a38a9291885e99c79f643885a132078d5a2d8925ec" Sep 29 11:33:52 crc kubenswrapper[4752]: I0929 11:33:52.347788 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pnxzg"] Sep 29 11:33:52 crc kubenswrapper[4752]: I0929 11:33:52.355606 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pnxzg"] Sep 29 11:33:52 crc kubenswrapper[4752]: I0929 11:33:52.369626 4752 scope.go:117] "RemoveContainer" containerID="69a19c73734901b941d9160f813e5eddf67cf23d0e587fa0b46f9f08f7845fb9" Sep 29 11:33:54 crc kubenswrapper[4752]: I0929 11:33:54.043245 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d3a8ea2-c5d1-43e3-911d-74b6402135ee" path="/var/lib/kubelet/pods/0d3a8ea2-c5d1-43e3-911d-74b6402135ee/volumes" Sep 29 11:33:56 crc kubenswrapper[4752]: I0929 11:33:56.032447 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:33:56 crc kubenswrapper[4752]: E0929 11:33:56.032861 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:34:09 crc kubenswrapper[4752]: I0929 11:34:09.031709 4752 scope.go:117] "RemoveContainer" 
containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:34:09 crc kubenswrapper[4752]: E0929 11:34:09.032572 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:34:24 crc kubenswrapper[4752]: I0929 11:34:24.031927 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:34:24 crc kubenswrapper[4752]: E0929 11:34:24.033015 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:34:39 crc kubenswrapper[4752]: I0929 11:34:39.030383 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:34:39 crc kubenswrapper[4752]: E0929 11:34:39.031208 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:34:51 crc kubenswrapper[4752]: I0929 11:34:51.037694 4752 scope.go:117] 
"RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:34:51 crc kubenswrapper[4752]: E0929 11:34:51.038644 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:35:04 crc kubenswrapper[4752]: I0929 11:35:04.031180 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:35:04 crc kubenswrapper[4752]: E0929 11:35:04.031963 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:35:18 crc kubenswrapper[4752]: I0929 11:35:18.031236 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:35:18 crc kubenswrapper[4752]: E0929 11:35:18.033111 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:35:21 crc kubenswrapper[4752]: I0929 11:35:21.423895 
4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9qjpl"] Sep 29 11:35:21 crc kubenswrapper[4752]: E0929 11:35:21.424546 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3a8ea2-c5d1-43e3-911d-74b6402135ee" containerName="extract-utilities" Sep 29 11:35:21 crc kubenswrapper[4752]: I0929 11:35:21.424562 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3a8ea2-c5d1-43e3-911d-74b6402135ee" containerName="extract-utilities" Sep 29 11:35:21 crc kubenswrapper[4752]: E0929 11:35:21.424579 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3a8ea2-c5d1-43e3-911d-74b6402135ee" containerName="registry-server" Sep 29 11:35:21 crc kubenswrapper[4752]: I0929 11:35:21.424588 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3a8ea2-c5d1-43e3-911d-74b6402135ee" containerName="registry-server" Sep 29 11:35:21 crc kubenswrapper[4752]: E0929 11:35:21.424605 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3a8ea2-c5d1-43e3-911d-74b6402135ee" containerName="extract-content" Sep 29 11:35:21 crc kubenswrapper[4752]: I0929 11:35:21.424610 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3a8ea2-c5d1-43e3-911d-74b6402135ee" containerName="extract-content" Sep 29 11:35:21 crc kubenswrapper[4752]: I0929 11:35:21.424759 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d3a8ea2-c5d1-43e3-911d-74b6402135ee" containerName="registry-server" Sep 29 11:35:21 crc kubenswrapper[4752]: I0929 11:35:21.426048 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9qjpl" Sep 29 11:35:21 crc kubenswrapper[4752]: I0929 11:35:21.442774 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9qjpl"] Sep 29 11:35:21 crc kubenswrapper[4752]: I0929 11:35:21.526393 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14090610-1109-42b7-851d-847e3d9b4f7a-catalog-content\") pod \"certified-operators-9qjpl\" (UID: \"14090610-1109-42b7-851d-847e3d9b4f7a\") " pod="openshift-marketplace/certified-operators-9qjpl" Sep 29 11:35:21 crc kubenswrapper[4752]: I0929 11:35:21.526453 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jzfr\" (UniqueName: \"kubernetes.io/projected/14090610-1109-42b7-851d-847e3d9b4f7a-kube-api-access-7jzfr\") pod \"certified-operators-9qjpl\" (UID: \"14090610-1109-42b7-851d-847e3d9b4f7a\") " pod="openshift-marketplace/certified-operators-9qjpl" Sep 29 11:35:21 crc kubenswrapper[4752]: I0929 11:35:21.526539 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14090610-1109-42b7-851d-847e3d9b4f7a-utilities\") pod \"certified-operators-9qjpl\" (UID: \"14090610-1109-42b7-851d-847e3d9b4f7a\") " pod="openshift-marketplace/certified-operators-9qjpl" Sep 29 11:35:21 crc kubenswrapper[4752]: I0929 11:35:21.627774 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14090610-1109-42b7-851d-847e3d9b4f7a-catalog-content\") pod \"certified-operators-9qjpl\" (UID: \"14090610-1109-42b7-851d-847e3d9b4f7a\") " pod="openshift-marketplace/certified-operators-9qjpl" Sep 29 11:35:21 crc kubenswrapper[4752]: I0929 11:35:21.627861 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7jzfr\" (UniqueName: \"kubernetes.io/projected/14090610-1109-42b7-851d-847e3d9b4f7a-kube-api-access-7jzfr\") pod \"certified-operators-9qjpl\" (UID: \"14090610-1109-42b7-851d-847e3d9b4f7a\") " pod="openshift-marketplace/certified-operators-9qjpl" Sep 29 11:35:21 crc kubenswrapper[4752]: I0929 11:35:21.627941 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14090610-1109-42b7-851d-847e3d9b4f7a-utilities\") pod \"certified-operators-9qjpl\" (UID: \"14090610-1109-42b7-851d-847e3d9b4f7a\") " pod="openshift-marketplace/certified-operators-9qjpl" Sep 29 11:35:21 crc kubenswrapper[4752]: I0929 11:35:21.628472 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14090610-1109-42b7-851d-847e3d9b4f7a-utilities\") pod \"certified-operators-9qjpl\" (UID: \"14090610-1109-42b7-851d-847e3d9b4f7a\") " pod="openshift-marketplace/certified-operators-9qjpl" Sep 29 11:35:21 crc kubenswrapper[4752]: I0929 11:35:21.628723 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14090610-1109-42b7-851d-847e3d9b4f7a-catalog-content\") pod \"certified-operators-9qjpl\" (UID: \"14090610-1109-42b7-851d-847e3d9b4f7a\") " pod="openshift-marketplace/certified-operators-9qjpl" Sep 29 11:35:21 crc kubenswrapper[4752]: I0929 11:35:21.650990 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jzfr\" (UniqueName: \"kubernetes.io/projected/14090610-1109-42b7-851d-847e3d9b4f7a-kube-api-access-7jzfr\") pod \"certified-operators-9qjpl\" (UID: \"14090610-1109-42b7-851d-847e3d9b4f7a\") " pod="openshift-marketplace/certified-operators-9qjpl" Sep 29 11:35:21 crc kubenswrapper[4752]: I0929 11:35:21.747445 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9qjpl" Sep 29 11:35:22 crc kubenswrapper[4752]: I0929 11:35:22.197223 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9qjpl"] Sep 29 11:35:23 crc kubenswrapper[4752]: I0929 11:35:23.076969 4752 generic.go:334] "Generic (PLEG): container finished" podID="14090610-1109-42b7-851d-847e3d9b4f7a" containerID="a58a61d87422443f392c3d362cbe6f918c1914617603edc28def613df79f29ad" exitCode=0 Sep 29 11:35:23 crc kubenswrapper[4752]: I0929 11:35:23.077070 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qjpl" event={"ID":"14090610-1109-42b7-851d-847e3d9b4f7a","Type":"ContainerDied","Data":"a58a61d87422443f392c3d362cbe6f918c1914617603edc28def613df79f29ad"} Sep 29 11:35:23 crc kubenswrapper[4752]: I0929 11:35:23.077273 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qjpl" event={"ID":"14090610-1109-42b7-851d-847e3d9b4f7a","Type":"ContainerStarted","Data":"f34d3942e4377158dd9c35ae864979b75508901277e9ba13e583e9399128db0d"} Sep 29 11:35:24 crc kubenswrapper[4752]: I0929 11:35:24.091458 4752 generic.go:334] "Generic (PLEG): container finished" podID="14090610-1109-42b7-851d-847e3d9b4f7a" containerID="cede6878b5f399e6c6c82a1f90cae743db634b278c47f662821b65e68055b814" exitCode=0 Sep 29 11:35:24 crc kubenswrapper[4752]: I0929 11:35:24.091628 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qjpl" event={"ID":"14090610-1109-42b7-851d-847e3d9b4f7a","Type":"ContainerDied","Data":"cede6878b5f399e6c6c82a1f90cae743db634b278c47f662821b65e68055b814"} Sep 29 11:35:25 crc kubenswrapper[4752]: I0929 11:35:25.104979 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qjpl" 
event={"ID":"14090610-1109-42b7-851d-847e3d9b4f7a","Type":"ContainerStarted","Data":"3d5371d81d7fabe06b26cc70ae9f0d9597f3f60a8212d7ab1fd3f55b4fc986e3"} Sep 29 11:35:25 crc kubenswrapper[4752]: I0929 11:35:25.125664 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9qjpl" podStartSLOduration=2.684878645 podStartE2EDuration="4.125648541s" podCreationTimestamp="2025-09-29 11:35:21 +0000 UTC" firstStartedPulling="2025-09-29 11:35:23.080530345 +0000 UTC m=+3063.869672012" lastFinishedPulling="2025-09-29 11:35:24.521300221 +0000 UTC m=+3065.310441908" observedRunningTime="2025-09-29 11:35:25.121067181 +0000 UTC m=+3065.910208888" watchObservedRunningTime="2025-09-29 11:35:25.125648541 +0000 UTC m=+3065.914790208" Sep 29 11:35:29 crc kubenswrapper[4752]: I0929 11:35:29.031737 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:35:29 crc kubenswrapper[4752]: E0929 11:35:29.032244 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:35:31 crc kubenswrapper[4752]: I0929 11:35:31.748394 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9qjpl" Sep 29 11:35:31 crc kubenswrapper[4752]: I0929 11:35:31.748719 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9qjpl" Sep 29 11:35:31 crc kubenswrapper[4752]: I0929 11:35:31.805116 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-9qjpl" Sep 29 11:35:32 crc kubenswrapper[4752]: I0929 11:35:32.202392 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9qjpl" Sep 29 11:35:35 crc kubenswrapper[4752]: I0929 11:35:35.420565 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9qjpl"] Sep 29 11:35:35 crc kubenswrapper[4752]: I0929 11:35:35.421309 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9qjpl" podUID="14090610-1109-42b7-851d-847e3d9b4f7a" containerName="registry-server" containerID="cri-o://3d5371d81d7fabe06b26cc70ae9f0d9597f3f60a8212d7ab1fd3f55b4fc986e3" gracePeriod=2 Sep 29 11:35:35 crc kubenswrapper[4752]: I0929 11:35:35.844830 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qjpl" Sep 29 11:35:35 crc kubenswrapper[4752]: I0929 11:35:35.958140 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14090610-1109-42b7-851d-847e3d9b4f7a-catalog-content\") pod \"14090610-1109-42b7-851d-847e3d9b4f7a\" (UID: \"14090610-1109-42b7-851d-847e3d9b4f7a\") " Sep 29 11:35:35 crc kubenswrapper[4752]: I0929 11:35:35.958201 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jzfr\" (UniqueName: \"kubernetes.io/projected/14090610-1109-42b7-851d-847e3d9b4f7a-kube-api-access-7jzfr\") pod \"14090610-1109-42b7-851d-847e3d9b4f7a\" (UID: \"14090610-1109-42b7-851d-847e3d9b4f7a\") " Sep 29 11:35:35 crc kubenswrapper[4752]: I0929 11:35:35.958375 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14090610-1109-42b7-851d-847e3d9b4f7a-utilities\") pod 
\"14090610-1109-42b7-851d-847e3d9b4f7a\" (UID: \"14090610-1109-42b7-851d-847e3d9b4f7a\") " Sep 29 11:35:35 crc kubenswrapper[4752]: I0929 11:35:35.959595 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14090610-1109-42b7-851d-847e3d9b4f7a-utilities" (OuterVolumeSpecName: "utilities") pod "14090610-1109-42b7-851d-847e3d9b4f7a" (UID: "14090610-1109-42b7-851d-847e3d9b4f7a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:35:35 crc kubenswrapper[4752]: I0929 11:35:35.964630 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14090610-1109-42b7-851d-847e3d9b4f7a-kube-api-access-7jzfr" (OuterVolumeSpecName: "kube-api-access-7jzfr") pod "14090610-1109-42b7-851d-847e3d9b4f7a" (UID: "14090610-1109-42b7-851d-847e3d9b4f7a"). InnerVolumeSpecName "kube-api-access-7jzfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:35:35 crc kubenswrapper[4752]: I0929 11:35:35.992148 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j48h8/must-gather-2f47z"] Sep 29 11:35:35 crc kubenswrapper[4752]: E0929 11:35:35.992534 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14090610-1109-42b7-851d-847e3d9b4f7a" containerName="extract-utilities" Sep 29 11:35:35 crc kubenswrapper[4752]: I0929 11:35:35.992559 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="14090610-1109-42b7-851d-847e3d9b4f7a" containerName="extract-utilities" Sep 29 11:35:35 crc kubenswrapper[4752]: E0929 11:35:35.992572 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14090610-1109-42b7-851d-847e3d9b4f7a" containerName="extract-content" Sep 29 11:35:35 crc kubenswrapper[4752]: I0929 11:35:35.992579 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="14090610-1109-42b7-851d-847e3d9b4f7a" containerName="extract-content" Sep 29 11:35:35 crc kubenswrapper[4752]: E0929 
11:35:35.992590 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14090610-1109-42b7-851d-847e3d9b4f7a" containerName="registry-server" Sep 29 11:35:35 crc kubenswrapper[4752]: I0929 11:35:35.992597 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="14090610-1109-42b7-851d-847e3d9b4f7a" containerName="registry-server" Sep 29 11:35:35 crc kubenswrapper[4752]: I0929 11:35:35.992893 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="14090610-1109-42b7-851d-847e3d9b4f7a" containerName="registry-server" Sep 29 11:35:35 crc kubenswrapper[4752]: I0929 11:35:35.994455 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j48h8/must-gather-2f47z" Sep 29 11:35:35 crc kubenswrapper[4752]: I0929 11:35:35.996439 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-j48h8"/"openshift-service-ca.crt" Sep 29 11:35:35 crc kubenswrapper[4752]: I0929 11:35:35.996523 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-j48h8"/"default-dockercfg-kg6cj" Sep 29 11:35:35 crc kubenswrapper[4752]: I0929 11:35:35.997411 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-j48h8"/"kube-root-ca.crt" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.010720 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j48h8/must-gather-2f47z"] Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.037513 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14090610-1109-42b7-851d-847e3d9b4f7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14090610-1109-42b7-851d-847e3d9b4f7a" (UID: "14090610-1109-42b7-851d-847e3d9b4f7a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.065397 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd8k7\" (UniqueName: \"kubernetes.io/projected/95d94158-0ee1-4665-ab07-2656c87c4881-kube-api-access-sd8k7\") pod \"must-gather-2f47z\" (UID: \"95d94158-0ee1-4665-ab07-2656c87c4881\") " pod="openshift-must-gather-j48h8/must-gather-2f47z" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.065767 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/95d94158-0ee1-4665-ab07-2656c87c4881-must-gather-output\") pod \"must-gather-2f47z\" (UID: \"95d94158-0ee1-4665-ab07-2656c87c4881\") " pod="openshift-must-gather-j48h8/must-gather-2f47z" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.065978 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14090610-1109-42b7-851d-847e3d9b4f7a-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.066062 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14090610-1109-42b7-851d-847e3d9b4f7a-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.066159 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jzfr\" (UniqueName: \"kubernetes.io/projected/14090610-1109-42b7-851d-847e3d9b4f7a-kube-api-access-7jzfr\") on node \"crc\" DevicePath \"\"" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.167904 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/95d94158-0ee1-4665-ab07-2656c87c4881-must-gather-output\") pod \"must-gather-2f47z\" (UID: 
\"95d94158-0ee1-4665-ab07-2656c87c4881\") " pod="openshift-must-gather-j48h8/must-gather-2f47z" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.168072 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd8k7\" (UniqueName: \"kubernetes.io/projected/95d94158-0ee1-4665-ab07-2656c87c4881-kube-api-access-sd8k7\") pod \"must-gather-2f47z\" (UID: \"95d94158-0ee1-4665-ab07-2656c87c4881\") " pod="openshift-must-gather-j48h8/must-gather-2f47z" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.168449 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/95d94158-0ee1-4665-ab07-2656c87c4881-must-gather-output\") pod \"must-gather-2f47z\" (UID: \"95d94158-0ee1-4665-ab07-2656c87c4881\") " pod="openshift-must-gather-j48h8/must-gather-2f47z" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.188713 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd8k7\" (UniqueName: \"kubernetes.io/projected/95d94158-0ee1-4665-ab07-2656c87c4881-kube-api-access-sd8k7\") pod \"must-gather-2f47z\" (UID: \"95d94158-0ee1-4665-ab07-2656c87c4881\") " pod="openshift-must-gather-j48h8/must-gather-2f47z" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.193281 4752 generic.go:334] "Generic (PLEG): container finished" podID="14090610-1109-42b7-851d-847e3d9b4f7a" containerID="3d5371d81d7fabe06b26cc70ae9f0d9597f3f60a8212d7ab1fd3f55b4fc986e3" exitCode=0 Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.193318 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qjpl" event={"ID":"14090610-1109-42b7-851d-847e3d9b4f7a","Type":"ContainerDied","Data":"3d5371d81d7fabe06b26cc70ae9f0d9597f3f60a8212d7ab1fd3f55b4fc986e3"} Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.193343 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-9qjpl" event={"ID":"14090610-1109-42b7-851d-847e3d9b4f7a","Type":"ContainerDied","Data":"f34d3942e4377158dd9c35ae864979b75508901277e9ba13e583e9399128db0d"} Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.193363 4752 scope.go:117] "RemoveContainer" containerID="3d5371d81d7fabe06b26cc70ae9f0d9597f3f60a8212d7ab1fd3f55b4fc986e3" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.193395 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qjpl" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.229954 4752 scope.go:117] "RemoveContainer" containerID="cede6878b5f399e6c6c82a1f90cae743db634b278c47f662821b65e68055b814" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.233590 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9qjpl"] Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.249068 4752 scope.go:117] "RemoveContainer" containerID="a58a61d87422443f392c3d362cbe6f918c1914617603edc28def613df79f29ad" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.249969 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9qjpl"] Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.270422 4752 scope.go:117] "RemoveContainer" containerID="3d5371d81d7fabe06b26cc70ae9f0d9597f3f60a8212d7ab1fd3f55b4fc986e3" Sep 29 11:35:36 crc kubenswrapper[4752]: E0929 11:35:36.271053 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d5371d81d7fabe06b26cc70ae9f0d9597f3f60a8212d7ab1fd3f55b4fc986e3\": container with ID starting with 3d5371d81d7fabe06b26cc70ae9f0d9597f3f60a8212d7ab1fd3f55b4fc986e3 not found: ID does not exist" containerID="3d5371d81d7fabe06b26cc70ae9f0d9597f3f60a8212d7ab1fd3f55b4fc986e3" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.271108 4752 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d5371d81d7fabe06b26cc70ae9f0d9597f3f60a8212d7ab1fd3f55b4fc986e3"} err="failed to get container status \"3d5371d81d7fabe06b26cc70ae9f0d9597f3f60a8212d7ab1fd3f55b4fc986e3\": rpc error: code = NotFound desc = could not find container \"3d5371d81d7fabe06b26cc70ae9f0d9597f3f60a8212d7ab1fd3f55b4fc986e3\": container with ID starting with 3d5371d81d7fabe06b26cc70ae9f0d9597f3f60a8212d7ab1fd3f55b4fc986e3 not found: ID does not exist" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.271140 4752 scope.go:117] "RemoveContainer" containerID="cede6878b5f399e6c6c82a1f90cae743db634b278c47f662821b65e68055b814" Sep 29 11:35:36 crc kubenswrapper[4752]: E0929 11:35:36.271494 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cede6878b5f399e6c6c82a1f90cae743db634b278c47f662821b65e68055b814\": container with ID starting with cede6878b5f399e6c6c82a1f90cae743db634b278c47f662821b65e68055b814 not found: ID does not exist" containerID="cede6878b5f399e6c6c82a1f90cae743db634b278c47f662821b65e68055b814" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.271604 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cede6878b5f399e6c6c82a1f90cae743db634b278c47f662821b65e68055b814"} err="failed to get container status \"cede6878b5f399e6c6c82a1f90cae743db634b278c47f662821b65e68055b814\": rpc error: code = NotFound desc = could not find container \"cede6878b5f399e6c6c82a1f90cae743db634b278c47f662821b65e68055b814\": container with ID starting with cede6878b5f399e6c6c82a1f90cae743db634b278c47f662821b65e68055b814 not found: ID does not exist" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.271711 4752 scope.go:117] "RemoveContainer" containerID="a58a61d87422443f392c3d362cbe6f918c1914617603edc28def613df79f29ad" Sep 29 11:35:36 crc kubenswrapper[4752]: E0929 
11:35:36.272098 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a58a61d87422443f392c3d362cbe6f918c1914617603edc28def613df79f29ad\": container with ID starting with a58a61d87422443f392c3d362cbe6f918c1914617603edc28def613df79f29ad not found: ID does not exist" containerID="a58a61d87422443f392c3d362cbe6f918c1914617603edc28def613df79f29ad" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.272208 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a58a61d87422443f392c3d362cbe6f918c1914617603edc28def613df79f29ad"} err="failed to get container status \"a58a61d87422443f392c3d362cbe6f918c1914617603edc28def613df79f29ad\": rpc error: code = NotFound desc = could not find container \"a58a61d87422443f392c3d362cbe6f918c1914617603edc28def613df79f29ad\": container with ID starting with a58a61d87422443f392c3d362cbe6f918c1914617603edc28def613df79f29ad not found: ID does not exist" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.364942 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j48h8/must-gather-2f47z" Sep 29 11:35:36 crc kubenswrapper[4752]: I0929 11:35:36.597976 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j48h8/must-gather-2f47z"] Sep 29 11:35:36 crc kubenswrapper[4752]: W0929 11:35:36.598209 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95d94158_0ee1_4665_ab07_2656c87c4881.slice/crio-6343b7113dcc245d8d8e6d4e0e3f26c640663eda3b5afc2e6867fc7f6bd631dc WatchSource:0}: Error finding container 6343b7113dcc245d8d8e6d4e0e3f26c640663eda3b5afc2e6867fc7f6bd631dc: Status 404 returned error can't find the container with id 6343b7113dcc245d8d8e6d4e0e3f26c640663eda3b5afc2e6867fc7f6bd631dc Sep 29 11:35:37 crc kubenswrapper[4752]: I0929 11:35:37.205448 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j48h8/must-gather-2f47z" event={"ID":"95d94158-0ee1-4665-ab07-2656c87c4881","Type":"ContainerStarted","Data":"6343b7113dcc245d8d8e6d4e0e3f26c640663eda3b5afc2e6867fc7f6bd631dc"} Sep 29 11:35:38 crc kubenswrapper[4752]: I0929 11:35:38.044243 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14090610-1109-42b7-851d-847e3d9b4f7a" path="/var/lib/kubelet/pods/14090610-1109-42b7-851d-847e3d9b4f7a/volumes" Sep 29 11:35:41 crc kubenswrapper[4752]: I0929 11:35:41.259244 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j48h8/must-gather-2f47z" event={"ID":"95d94158-0ee1-4665-ab07-2656c87c4881","Type":"ContainerStarted","Data":"cd561ceee4ab9eb71a590a7c10aa7091e297e3dd1e33f9f0d1d2986d87b7fee2"} Sep 29 11:35:41 crc kubenswrapper[4752]: I0929 11:35:41.259710 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j48h8/must-gather-2f47z" 
event={"ID":"95d94158-0ee1-4665-ab07-2656c87c4881","Type":"ContainerStarted","Data":"73479a08349ee47fcb2639ac5576aa1eedca29689e911463f1118379c0be0f22"} Sep 29 11:35:41 crc kubenswrapper[4752]: I0929 11:35:41.277464 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-j48h8/must-gather-2f47z" podStartSLOduration=2.161049321 podStartE2EDuration="6.277443776s" podCreationTimestamp="2025-09-29 11:35:35 +0000 UTC" firstStartedPulling="2025-09-29 11:35:36.600708003 +0000 UTC m=+3077.389849670" lastFinishedPulling="2025-09-29 11:35:40.717102458 +0000 UTC m=+3081.506244125" observedRunningTime="2025-09-29 11:35:41.276793769 +0000 UTC m=+3082.065935446" watchObservedRunningTime="2025-09-29 11:35:41.277443776 +0000 UTC m=+3082.066585453" Sep 29 11:35:43 crc kubenswrapper[4752]: I0929 11:35:43.031666 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:35:43 crc kubenswrapper[4752]: E0929 11:35:43.032253 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:35:57 crc kubenswrapper[4752]: I0929 11:35:57.031647 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:35:57 crc kubenswrapper[4752]: E0929 11:35:57.032513 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:36:12 crc kubenswrapper[4752]: I0929 11:36:12.031926 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:36:12 crc kubenswrapper[4752]: E0929 11:36:12.032653 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:36:26 crc kubenswrapper[4752]: I0929 11:36:26.031178 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:36:26 crc kubenswrapper[4752]: E0929 11:36:26.031942 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:36:34 crc kubenswrapper[4752]: I0929 11:36:34.481824 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn_f94fa837-71f7-4540-b897-430e3f72928f/util/0.log" Sep 29 11:36:34 crc kubenswrapper[4752]: I0929 11:36:34.580304 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn_f94fa837-71f7-4540-b897-430e3f72928f/util/0.log" Sep 29 11:36:34 crc kubenswrapper[4752]: I0929 11:36:34.604268 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn_f94fa837-71f7-4540-b897-430e3f72928f/pull/0.log" Sep 29 11:36:34 crc kubenswrapper[4752]: I0929 11:36:34.614633 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn_f94fa837-71f7-4540-b897-430e3f72928f/pull/0.log" Sep 29 11:36:34 crc kubenswrapper[4752]: I0929 11:36:34.770562 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn_f94fa837-71f7-4540-b897-430e3f72928f/pull/0.log" Sep 29 11:36:34 crc kubenswrapper[4752]: I0929 11:36:34.786600 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn_f94fa837-71f7-4540-b897-430e3f72928f/util/0.log" Sep 29 11:36:34 crc kubenswrapper[4752]: I0929 11:36:34.787637 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_24245cca302114c11807d483276eec81c7be35f7f8990fcd9b59e964097xccn_f94fa837-71f7-4540-b897-430e3f72928f/extract/0.log" Sep 29 11:36:34 crc kubenswrapper[4752]: I0929 11:36:34.932534 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm_f1ec6fb4-143b-4af3-ac83-9a4e73a138eb/util/0.log" Sep 29 11:36:35 crc kubenswrapper[4752]: I0929 11:36:35.091827 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm_f1ec6fb4-143b-4af3-ac83-9a4e73a138eb/util/0.log" Sep 29 11:36:35 crc 
kubenswrapper[4752]: I0929 11:36:35.138105 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm_f1ec6fb4-143b-4af3-ac83-9a4e73a138eb/pull/0.log" Sep 29 11:36:35 crc kubenswrapper[4752]: I0929 11:36:35.140088 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm_f1ec6fb4-143b-4af3-ac83-9a4e73a138eb/pull/0.log" Sep 29 11:36:35 crc kubenswrapper[4752]: I0929 11:36:35.340744 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm_f1ec6fb4-143b-4af3-ac83-9a4e73a138eb/extract/0.log" Sep 29 11:36:35 crc kubenswrapper[4752]: I0929 11:36:35.343874 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm_f1ec6fb4-143b-4af3-ac83-9a4e73a138eb/util/0.log" Sep 29 11:36:35 crc kubenswrapper[4752]: I0929 11:36:35.344755 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ac6ff96320e17949fb797c48f279c0d92700e860ad7f5bcf96b6fff45664bpm_f1ec6fb4-143b-4af3-ac83-9a4e73a138eb/pull/0.log" Sep 29 11:36:35 crc kubenswrapper[4752]: I0929 11:36:35.498173 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6495d75b5-vdwdn_bcc941f9-bb40-42c6-a04a-2dbccdb5c63d/manager/0.log" Sep 29 11:36:35 crc kubenswrapper[4752]: I0929 11:36:35.527754 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6495d75b5-vdwdn_bcc941f9-bb40-42c6-a04a-2dbccdb5c63d/kube-rbac-proxy/0.log" Sep 29 11:36:35 crc kubenswrapper[4752]: I0929 11:36:35.534293 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-748c574d75-s6kn6_22f0a092-282c-4339-b57e-29bba94f1c26/kube-rbac-proxy/0.log" Sep 29 11:36:35 crc kubenswrapper[4752]: I0929 11:36:35.734506 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d74f4d695-gnrn4_d3d16719-6f3a-40f3-a68e-7ca209588644/kube-rbac-proxy/0.log" Sep 29 11:36:35 crc kubenswrapper[4752]: I0929 11:36:35.735172 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-748c574d75-s6kn6_22f0a092-282c-4339-b57e-29bba94f1c26/manager/0.log" Sep 29 11:36:35 crc kubenswrapper[4752]: I0929 11:36:35.808540 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d74f4d695-gnrn4_d3d16719-6f3a-40f3-a68e-7ca209588644/manager/0.log" Sep 29 11:36:35 crc kubenswrapper[4752]: I0929 11:36:35.991766 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-67b5d44b7f-fzv5x_02766095-b09a-47c5-b0c0-8577cf0c4df0/kube-rbac-proxy/0.log" Sep 29 11:36:36 crc kubenswrapper[4752]: I0929 11:36:36.033086 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-67b5d44b7f-fzv5x_02766095-b09a-47c5-b0c0-8577cf0c4df0/manager/0.log" Sep 29 11:36:36 crc kubenswrapper[4752]: I0929 11:36:36.153151 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8ff95898-m96g4_ce29c55b-caec-4b27-b9fd-e815c897c38e/kube-rbac-proxy/0.log" Sep 29 11:36:36 crc kubenswrapper[4752]: I0929 11:36:36.215448 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8ff95898-m96g4_ce29c55b-caec-4b27-b9fd-e815c897c38e/manager/0.log" Sep 29 11:36:36 crc kubenswrapper[4752]: I0929 11:36:36.278697 4752 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-695847bc78-6ffsd_4c1a6d67-1063-44bf-a2fa-be8dde72fabf/kube-rbac-proxy/0.log" Sep 29 11:36:36 crc kubenswrapper[4752]: I0929 11:36:36.390905 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-695847bc78-6ffsd_4c1a6d67-1063-44bf-a2fa-be8dde72fabf/manager/0.log" Sep 29 11:36:36 crc kubenswrapper[4752]: I0929 11:36:36.526293 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-858cd69f49-c2twd_e06f89cc-db11-4692-ab2c-50405feb9ca1/kube-rbac-proxy/0.log" Sep 29 11:36:36 crc kubenswrapper[4752]: I0929 11:36:36.576552 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-858cd69f49-c2twd_e06f89cc-db11-4692-ab2c-50405feb9ca1/manager/0.log" Sep 29 11:36:36 crc kubenswrapper[4752]: I0929 11:36:36.698343 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9fc8d5567-lncbz_6f6d7f0e-296b-493f-8e5b-82dc348a3e6d/kube-rbac-proxy/0.log" Sep 29 11:36:36 crc kubenswrapper[4752]: I0929 11:36:36.759646 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9fc8d5567-lncbz_6f6d7f0e-296b-493f-8e5b-82dc348a3e6d/manager/0.log" Sep 29 11:36:36 crc kubenswrapper[4752]: I0929 11:36:36.893833 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7bf498966c-vwbvq_4ba7065e-6eff-42bb-acc3-2595f5cc8e71/kube-rbac-proxy/0.log" Sep 29 11:36:36 crc kubenswrapper[4752]: I0929 11:36:36.952932 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7bf498966c-vwbvq_4ba7065e-6eff-42bb-acc3-2595f5cc8e71/manager/0.log" Sep 29 11:36:37 crc 
kubenswrapper[4752]: I0929 11:36:37.029778 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-56cf9c6b99-vxd7j_c95eca0a-b789-4db4-a906-d3323f1ee7ed/kube-rbac-proxy/0.log" Sep 29 11:36:37 crc kubenswrapper[4752]: I0929 11:36:37.268320 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-687b9cf756-qnmsk_28e91257-715f-462e-be70-361652522cb3/kube-rbac-proxy/0.log" Sep 29 11:36:37 crc kubenswrapper[4752]: I0929 11:36:37.269187 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-56cf9c6b99-vxd7j_c95eca0a-b789-4db4-a906-d3323f1ee7ed/manager/0.log" Sep 29 11:36:37 crc kubenswrapper[4752]: I0929 11:36:37.357491 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-687b9cf756-qnmsk_28e91257-715f-462e-be70-361652522cb3/manager/0.log" Sep 29 11:36:37 crc kubenswrapper[4752]: I0929 11:36:37.471894 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54d766c9f9-w5nvh_97664b33-b187-472d-8e08-462174d3e49a/kube-rbac-proxy/0.log" Sep 29 11:36:37 crc kubenswrapper[4752]: I0929 11:36:37.472340 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54d766c9f9-w5nvh_97664b33-b187-472d-8e08-462174d3e49a/manager/0.log" Sep 29 11:36:37 crc kubenswrapper[4752]: I0929 11:36:37.648670 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-5nh8j_db7baf7a-c952-4e2d-adaa-87815a1ad895/kube-rbac-proxy/0.log" Sep 29 11:36:37 crc kubenswrapper[4752]: I0929 11:36:37.669709 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-5nh8j_db7baf7a-c952-4e2d-adaa-87815a1ad895/manager/0.log" Sep 29 11:36:37 crc kubenswrapper[4752]: I0929 11:36:37.766340 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-gkcw9_c35176b6-d50b-4e71-9c73-063a8213988c/kube-rbac-proxy/0.log" Sep 29 11:36:37 crc kubenswrapper[4752]: I0929 11:36:37.800254 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-gkcw9_c35176b6-d50b-4e71-9c73-063a8213988c/manager/0.log" Sep 29 11:36:37 crc kubenswrapper[4752]: I0929 11:36:37.909299 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-474s5_6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a/kube-rbac-proxy/0.log" Sep 29 11:36:37 crc kubenswrapper[4752]: I0929 11:36:37.918850 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-474s5_6c7e19dc-afaf-4b5e-96b0-2049f29b5d0a/manager/0.log" Sep 29 11:36:38 crc kubenswrapper[4752]: I0929 11:36:38.016725 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-74b6d8f6ff-gsjz5_5471e4e9-e190-4c84-a23b-090158ae7133/kube-rbac-proxy/0.log" Sep 29 11:36:38 crc kubenswrapper[4752]: I0929 11:36:38.144786 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-2s7f8_b333e87b-b51e-4644-b1de-f08e213d84bd/registry-server/0.log" Sep 29 11:36:38 crc kubenswrapper[4752]: I0929 11:36:38.229642 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5f95c46c78-tg8bp_54b928b4-cc8d-4093-8aef-4d6540a226c3/kube-rbac-proxy/0.log" Sep 29 11:36:38 crc kubenswrapper[4752]: I0929 
11:36:38.376226 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5f95c46c78-tg8bp_54b928b4-cc8d-4093-8aef-4d6540a226c3/manager/0.log" Sep 29 11:36:38 crc kubenswrapper[4752]: I0929 11:36:38.446013 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-74b6d8f6ff-gsjz5_5471e4e9-e190-4c84-a23b-090158ae7133/manager/0.log" Sep 29 11:36:38 crc kubenswrapper[4752]: I0929 11:36:38.457337 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-774b97b48-zgk9d_06abe84c-c450-4214-b94c-dc8ac39422bd/manager/0.log" Sep 29 11:36:38 crc kubenswrapper[4752]: I0929 11:36:38.480434 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-774b97b48-zgk9d_06abe84c-c450-4214-b94c-dc8ac39422bd/kube-rbac-proxy/0.log" Sep 29 11:36:38 crc kubenswrapper[4752]: I0929 11:36:38.605389 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-l55rr_5b974650-8f7f-4826-b767-0dcd35bb6f3f/operator/0.log" Sep 29 11:36:38 crc kubenswrapper[4752]: I0929 11:36:38.666012 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-48hc9_f6d91c53-a4c9-4516-8666-75dac446a27e/manager/0.log" Sep 29 11:36:38 crc kubenswrapper[4752]: I0929 11:36:38.670048 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-48hc9_f6d91c53-a4c9-4516-8666-75dac446a27e/kube-rbac-proxy/0.log" Sep 29 11:36:38 crc kubenswrapper[4752]: I0929 11:36:38.802931 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5bf96cfbc4-g82n6_ac414d81-10d8-4ef8-aeed-3f8bf43eae1d/kube-rbac-proxy/0.log" Sep 29 
11:36:38 crc kubenswrapper[4752]: I0929 11:36:38.871775 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-rwzzh_ec44f1a0-8747-41d6-bf36-9899957065fc/kube-rbac-proxy/0.log" Sep 29 11:36:39 crc kubenswrapper[4752]: I0929 11:36:39.023036 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5bf96cfbc4-g82n6_ac414d81-10d8-4ef8-aeed-3f8bf43eae1d/manager/0.log" Sep 29 11:36:39 crc kubenswrapper[4752]: I0929 11:36:39.055125 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-rwzzh_ec44f1a0-8747-41d6-bf36-9899957065fc/manager/0.log" Sep 29 11:36:39 crc kubenswrapper[4752]: I0929 11:36:39.164707 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-565b894b7f-l7zgn_2dd90864-f3b9-4972-933f-d6c7a146f8c0/kube-rbac-proxy/0.log" Sep 29 11:36:39 crc kubenswrapper[4752]: I0929 11:36:39.244244 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-index-dmdnl_39307fc3-f44d-447a-80b4-f10d10871be4/registry-server/0.log" Sep 29 11:36:39 crc kubenswrapper[4752]: I0929 11:36:39.442522 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-565b894b7f-l7zgn_2dd90864-f3b9-4972-933f-d6c7a146f8c0/manager/0.log" Sep 29 11:36:41 crc kubenswrapper[4752]: I0929 11:36:41.031097 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:36:41 crc kubenswrapper[4752]: E0929 11:36:41.031502 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:36:55 crc kubenswrapper[4752]: I0929 11:36:55.030774 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:36:55 crc kubenswrapper[4752]: E0929 11:36:55.031573 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgrvs_openshift-machine-config-operator(5863c243-797d-462a-b11f-71aaf005f8d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" Sep 29 11:36:56 crc kubenswrapper[4752]: I0929 11:36:56.106381 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-s8pf8_24ab4270-1ece-4201-94ae-51c71902c3f1/control-plane-machine-set-operator/0.log" Sep 29 11:36:56 crc kubenswrapper[4752]: I0929 11:36:56.346439 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ntjk6_2477e356-dc04-44a6-bec0-e7304134493f/kube-rbac-proxy/0.log" Sep 29 11:36:56 crc kubenswrapper[4752]: I0929 11:36:56.346436 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ntjk6_2477e356-dc04-44a6-bec0-e7304134493f/machine-api-operator/0.log" Sep 29 11:37:08 crc kubenswrapper[4752]: I0929 11:37:08.726995 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-pcv7b_77aec240-9f3c-4bbd-bb60-c1c1a81999ae/cert-manager-controller/0.log" Sep 29 11:37:08 crc kubenswrapper[4752]: I0929 11:37:08.888653 4752 log.go:25] "Finished 
parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-vqkv7_31dd96e4-2a05-4c30-b142-2cdcd6d82798/cert-manager-cainjector/0.log" Sep 29 11:37:08 crc kubenswrapper[4752]: I0929 11:37:08.958517 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-bqbrt_466779ed-d985-4eba-9277-1c5be5b56f9d/cert-manager-webhook/0.log" Sep 29 11:37:10 crc kubenswrapper[4752]: I0929 11:37:10.035667 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:37:10 crc kubenswrapper[4752]: I0929 11:37:10.967217 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerStarted","Data":"4f090cb06b45016b138b19baf0457d61e893d318bd9d300e02183fbc10bfe7df"} Sep 29 11:37:21 crc kubenswrapper[4752]: I0929 11:37:21.008890 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-gwrnx_1f74c3bc-756d-48fc-848c-c8e8a045dee4/nmstate-console-plugin/0.log" Sep 29 11:37:21 crc kubenswrapper[4752]: I0929 11:37:21.229460 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-7pc9n_1031b1fd-032a-4972-835d-8f53171bc7de/nmstate-metrics/0.log" Sep 29 11:37:21 crc kubenswrapper[4752]: I0929 11:37:21.237456 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-nz9pc_5bace403-1b12-45b3-a28c-8a7c58393a22/nmstate-handler/0.log" Sep 29 11:37:21 crc kubenswrapper[4752]: I0929 11:37:21.270165 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-7pc9n_1031b1fd-032a-4972-835d-8f53171bc7de/kube-rbac-proxy/0.log" Sep 29 11:37:21 crc kubenswrapper[4752]: I0929 11:37:21.396193 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-s6ttt_c7616dcc-135c-41d4-bf4c-8f72270fa5fd/nmstate-operator/0.log" Sep 29 11:37:21 crc kubenswrapper[4752]: I0929 11:37:21.466311 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-75qgt_91fc3e7a-9ef1-435e-9fac-a273bdf2de57/nmstate-webhook/0.log" Sep 29 11:37:35 crc kubenswrapper[4752]: I0929 11:37:35.257389 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-fqwg9_20827846-6dfa-4a33-b073-182cd993a60a/kube-rbac-proxy/0.log" Sep 29 11:37:35 crc kubenswrapper[4752]: I0929 11:37:35.417530 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-fqwg9_20827846-6dfa-4a33-b073-182cd993a60a/controller/0.log" Sep 29 11:37:35 crc kubenswrapper[4752]: I0929 11:37:35.441251 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzcng_1311e458-5fb3-4322-9c19-08bce8710d6e/cp-frr-files/0.log" Sep 29 11:37:35 crc kubenswrapper[4752]: I0929 11:37:35.604600 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzcng_1311e458-5fb3-4322-9c19-08bce8710d6e/cp-frr-files/0.log" Sep 29 11:37:35 crc kubenswrapper[4752]: I0929 11:37:35.625454 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzcng_1311e458-5fb3-4322-9c19-08bce8710d6e/cp-metrics/0.log" Sep 29 11:37:35 crc kubenswrapper[4752]: I0929 11:37:35.664052 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzcng_1311e458-5fb3-4322-9c19-08bce8710d6e/cp-reloader/0.log" Sep 29 11:37:35 crc kubenswrapper[4752]: I0929 11:37:35.671088 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzcng_1311e458-5fb3-4322-9c19-08bce8710d6e/cp-reloader/0.log" Sep 29 11:37:35 crc kubenswrapper[4752]: I0929 11:37:35.920353 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-rzcng_1311e458-5fb3-4322-9c19-08bce8710d6e/cp-metrics/0.log" Sep 29 11:37:36 crc kubenswrapper[4752]: I0929 11:37:36.028609 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzcng_1311e458-5fb3-4322-9c19-08bce8710d6e/cp-reloader/0.log" Sep 29 11:37:36 crc kubenswrapper[4752]: I0929 11:37:36.039698 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzcng_1311e458-5fb3-4322-9c19-08bce8710d6e/cp-frr-files/0.log" Sep 29 11:37:36 crc kubenswrapper[4752]: I0929 11:37:36.060559 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzcng_1311e458-5fb3-4322-9c19-08bce8710d6e/cp-metrics/0.log" Sep 29 11:37:36 crc kubenswrapper[4752]: I0929 11:37:36.256847 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzcng_1311e458-5fb3-4322-9c19-08bce8710d6e/cp-frr-files/0.log" Sep 29 11:37:36 crc kubenswrapper[4752]: I0929 11:37:36.263453 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzcng_1311e458-5fb3-4322-9c19-08bce8710d6e/cp-reloader/0.log" Sep 29 11:37:36 crc kubenswrapper[4752]: I0929 11:37:36.277242 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzcng_1311e458-5fb3-4322-9c19-08bce8710d6e/cp-metrics/0.log" Sep 29 11:37:36 crc kubenswrapper[4752]: I0929 11:37:36.303145 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzcng_1311e458-5fb3-4322-9c19-08bce8710d6e/controller/0.log" Sep 29 11:37:36 crc kubenswrapper[4752]: I0929 11:37:36.443257 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzcng_1311e458-5fb3-4322-9c19-08bce8710d6e/frr-metrics/0.log" Sep 29 11:37:36 crc kubenswrapper[4752]: I0929 11:37:36.479217 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-rzcng_1311e458-5fb3-4322-9c19-08bce8710d6e/kube-rbac-proxy/0.log" Sep 29 11:37:36 crc kubenswrapper[4752]: I0929 11:37:36.491179 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzcng_1311e458-5fb3-4322-9c19-08bce8710d6e/kube-rbac-proxy-frr/0.log" Sep 29 11:37:36 crc kubenswrapper[4752]: I0929 11:37:36.697938 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-tmdtx_bafc1f1a-c41b-4f47-bf6c-134aa6d5f9f0/frr-k8s-webhook-server/0.log" Sep 29 11:37:36 crc kubenswrapper[4752]: I0929 11:37:36.702978 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzcng_1311e458-5fb3-4322-9c19-08bce8710d6e/reloader/0.log" Sep 29 11:37:37 crc kubenswrapper[4752]: I0929 11:37:37.058295 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-74ffd6549c-bbnch_945488c0-5a99-455c-b0da-f3d188ff4438/manager/0.log" Sep 29 11:37:37 crc kubenswrapper[4752]: I0929 11:37:37.248498 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7cb899cd5c-bldh9_9eeaaa19-fdbf-4d23-90bb-94c687033574/webhook-server/0.log" Sep 29 11:37:37 crc kubenswrapper[4752]: I0929 11:37:37.251153 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzcng_1311e458-5fb3-4322-9c19-08bce8710d6e/frr/0.log" Sep 29 11:37:37 crc kubenswrapper[4752]: I0929 11:37:37.310454 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-s4jmn_09cd799c-a613-4441-b407-b907862fec48/kube-rbac-proxy/0.log" Sep 29 11:37:37 crc kubenswrapper[4752]: I0929 11:37:37.654865 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-s4jmn_09cd799c-a613-4441-b407-b907862fec48/speaker/0.log" Sep 29 11:37:59 crc kubenswrapper[4752]: I0929 11:37:59.301759 4752 log.go:25] 
"Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_a00720d9-6655-4234-a3ea-25de9303b7e4/init-config-reloader/0.log" Sep 29 11:37:59 crc kubenswrapper[4752]: I0929 11:37:59.448140 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_a00720d9-6655-4234-a3ea-25de9303b7e4/alertmanager/0.log" Sep 29 11:37:59 crc kubenswrapper[4752]: I0929 11:37:59.494279 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_a00720d9-6655-4234-a3ea-25de9303b7e4/config-reloader/0.log" Sep 29 11:37:59 crc kubenswrapper[4752]: I0929 11:37:59.572308 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_a00720d9-6655-4234-a3ea-25de9303b7e4/init-config-reloader/0.log" Sep 29 11:37:59 crc kubenswrapper[4752]: I0929 11:37:59.655548 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_7b6facb1-a514-4754-bc86-f3b7acd0fc99/ceilometer-central-agent/0.log" Sep 29 11:37:59 crc kubenswrapper[4752]: I0929 11:37:59.695315 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_7b6facb1-a514-4754-bc86-f3b7acd0fc99/ceilometer-notification-agent/0.log" Sep 29 11:37:59 crc kubenswrapper[4752]: I0929 11:37:59.712559 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_7b6facb1-a514-4754-bc86-f3b7acd0fc99/proxy-httpd/0.log" Sep 29 11:37:59 crc kubenswrapper[4752]: I0929 11:37:59.753007 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_7b6facb1-a514-4754-bc86-f3b7acd0fc99/sg-core/0.log" Sep 29 11:37:59 crc kubenswrapper[4752]: I0929 11:37:59.989967 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_kube-state-metrics-0_8ebf91ee-06e1-493b-89ad-8af75463aa3e/kube-state-metrics/0.log" Sep 29 11:37:59 crc kubenswrapper[4752]: I0929 11:37:59.995898 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_keystone-556fd57974-4ns5d_514edfd0-8be7-4795-9f0e-fc403c69f692/keystone-api/0.log" Sep 29 11:38:00 crc kubenswrapper[4752]: I0929 11:38:00.295017 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstack-galera-0_58c49438-2c74-4c5e-b476-7fff98957387/mysql-bootstrap/0.log" Sep 29 11:38:00 crc kubenswrapper[4752]: I0929 11:38:00.377907 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstack-galera-0_58c49438-2c74-4c5e-b476-7fff98957387/mysql-bootstrap/0.log" Sep 29 11:38:00 crc kubenswrapper[4752]: I0929 11:38:00.434916 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstack-galera-0_58c49438-2c74-4c5e-b476-7fff98957387/galera/0.log" Sep 29 11:38:00 crc kubenswrapper[4752]: I0929 11:38:00.551215 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstackclient_228c713d-fd14-4b79-9658-e5e4dd26d2e0/openstackclient/0.log" Sep 29 11:38:00 crc kubenswrapper[4752]: I0929 11:38:00.708406 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2/init-config-reloader/0.log" Sep 29 11:38:00 crc kubenswrapper[4752]: I0929 11:38:00.878918 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2/config-reloader/0.log" Sep 29 11:38:00 crc kubenswrapper[4752]: I0929 11:38:00.916476 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2/prometheus/0.log" Sep 29 11:38:00 crc 
kubenswrapper[4752]: I0929 11:38:00.988817 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2/init-config-reloader/0.log" Sep 29 11:38:01 crc kubenswrapper[4752]: I0929 11:38:01.099916 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_e1e250ba-1828-4bd7-8e1d-71e5aea5a7f2/thanos-sidecar/0.log" Sep 29 11:38:01 crc kubenswrapper[4752]: I0929 11:38:01.160822 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-notifications-server-0_23451e36-d03b-4039-ba05-d20e013b089b/setup-container/0.log" Sep 29 11:38:01 crc kubenswrapper[4752]: I0929 11:38:01.419089 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-notifications-server-0_23451e36-d03b-4039-ba05-d20e013b089b/setup-container/0.log" Sep 29 11:38:01 crc kubenswrapper[4752]: I0929 11:38:01.448606 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-notifications-server-0_23451e36-d03b-4039-ba05-d20e013b089b/rabbitmq/0.log" Sep 29 11:38:01 crc kubenswrapper[4752]: I0929 11:38:01.648134 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-server-0_0b2b1ef4-961c-4803-856d-4d6deb42cc10/setup-container/0.log" Sep 29 11:38:02 crc kubenswrapper[4752]: I0929 11:38:02.003491 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-server-0_0b2b1ef4-961c-4803-856d-4d6deb42cc10/rabbitmq/0.log" Sep 29 11:38:02 crc kubenswrapper[4752]: I0929 11:38:02.011405 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-server-0_0b2b1ef4-961c-4803-856d-4d6deb42cc10/setup-container/0.log" Sep 29 11:38:02 crc kubenswrapper[4752]: I0929 11:38:02.240323 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-20c0-account-create-jr4ls_307f40b1-f2e8-408e-9d6b-068f0259c850/mariadb-account-create/0.log" Sep 29 11:38:02 crc kubenswrapper[4752]: I0929 11:38:02.378969 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-db-create-84dqj_9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd/mariadb-database-create/0.log" Sep 29 11:38:02 crc kubenswrapper[4752]: I0929 11:38:02.580016 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-api-0_04d72897-fee3-42f8-8b75-4ade4c6b4f9a/watcher-api/0.log" Sep 29 11:38:02 crc kubenswrapper[4752]: I0929 11:38:02.721652 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-api-0_04d72897-fee3-42f8-8b75-4ade4c6b4f9a/watcher-kuttl-api-log/0.log" Sep 29 11:38:02 crc kubenswrapper[4752]: I0929 11:38:02.953655 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-applier-0_555f166b-3ad0-4620-86ec-28c701938327/watcher-applier/0.log" Sep 29 11:38:03 crc kubenswrapper[4752]: I0929 11:38:03.067758 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-db-sync-fnlm6_c788fc02-bde8-4ee8-a380-0c50d4b56c65/watcher-kuttl-db-sync/0.log" Sep 29 11:38:03 crc kubenswrapper[4752]: I0929 11:38:03.248421 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_2de6e6b7-7c8e-4e4d-9e56-c341913f282c/watcher-decision-engine/0.log" Sep 29 11:38:16 crc kubenswrapper[4752]: I0929 11:38:16.845688 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_memcached-0_bfbc51a1-9e9b-4af1-865c-d6228444dded/memcached/0.log" Sep 29 11:38:20 crc kubenswrapper[4752]: I0929 11:38:20.314228 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh_67244ff2-a859-4d03-8ef2-8898fcfd01c0/util/0.log" Sep 29 11:38:20 crc kubenswrapper[4752]: I0929 11:38:20.517493 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh_67244ff2-a859-4d03-8ef2-8898fcfd01c0/util/0.log" Sep 29 11:38:20 crc kubenswrapper[4752]: I0929 11:38:20.526786 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh_67244ff2-a859-4d03-8ef2-8898fcfd01c0/pull/0.log" Sep 29 11:38:20 crc kubenswrapper[4752]: I0929 11:38:20.549505 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh_67244ff2-a859-4d03-8ef2-8898fcfd01c0/pull/0.log" Sep 29 11:38:20 crc kubenswrapper[4752]: I0929 11:38:20.722818 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh_67244ff2-a859-4d03-8ef2-8898fcfd01c0/util/0.log" Sep 29 11:38:20 crc kubenswrapper[4752]: I0929 11:38:20.756594 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh_67244ff2-a859-4d03-8ef2-8898fcfd01c0/extract/0.log" Sep 29 11:38:20 crc kubenswrapper[4752]: I0929 11:38:20.831906 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb694ldsh_67244ff2-a859-4d03-8ef2-8898fcfd01c0/pull/0.log" Sep 29 11:38:20 crc kubenswrapper[4752]: I0929 11:38:20.969665 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42_146d6e75-b6e0-4be4-a82e-cafe303eb1c1/util/0.log" Sep 29 
11:38:21 crc kubenswrapper[4752]: I0929 11:38:21.138774 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42_146d6e75-b6e0-4be4-a82e-cafe303eb1c1/util/0.log" Sep 29 11:38:21 crc kubenswrapper[4752]: I0929 11:38:21.156614 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42_146d6e75-b6e0-4be4-a82e-cafe303eb1c1/pull/0.log" Sep 29 11:38:21 crc kubenswrapper[4752]: I0929 11:38:21.184568 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42_146d6e75-b6e0-4be4-a82e-cafe303eb1c1/pull/0.log" Sep 29 11:38:21 crc kubenswrapper[4752]: I0929 11:38:21.365132 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42_146d6e75-b6e0-4be4-a82e-cafe303eb1c1/util/0.log" Sep 29 11:38:21 crc kubenswrapper[4752]: I0929 11:38:21.387771 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42_146d6e75-b6e0-4be4-a82e-cafe303eb1c1/pull/0.log" Sep 29 11:38:21 crc kubenswrapper[4752]: I0929 11:38:21.388633 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcwqf42_146d6e75-b6e0-4be4-a82e-cafe303eb1c1/extract/0.log" Sep 29 11:38:21 crc kubenswrapper[4752]: I0929 11:38:21.535601 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn_fd528af5-f62d-483e-8c18-ac07cdf64251/util/0.log" Sep 29 11:38:21 crc kubenswrapper[4752]: I0929 11:38:21.700768 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn_fd528af5-f62d-483e-8c18-ac07cdf64251/pull/0.log" Sep 29 11:38:21 crc kubenswrapper[4752]: I0929 11:38:21.731818 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn_fd528af5-f62d-483e-8c18-ac07cdf64251/pull/0.log" Sep 29 11:38:21 crc kubenswrapper[4752]: I0929 11:38:21.738949 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn_fd528af5-f62d-483e-8c18-ac07cdf64251/util/0.log" Sep 29 11:38:21 crc kubenswrapper[4752]: I0929 11:38:21.879853 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn_fd528af5-f62d-483e-8c18-ac07cdf64251/util/0.log" Sep 29 11:38:21 crc kubenswrapper[4752]: I0929 11:38:21.885321 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn_fd528af5-f62d-483e-8c18-ac07cdf64251/pull/0.log" Sep 29 11:38:21 crc kubenswrapper[4752]: I0929 11:38:21.893818 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dmwxrn_fd528af5-f62d-483e-8c18-ac07cdf64251/extract/0.log" Sep 29 11:38:22 crc kubenswrapper[4752]: I0929 11:38:22.090299 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-brllf_b31d27ce-a227-4b44-ac25-10e7c9133fb8/extract-utilities/0.log" Sep 29 11:38:22 crc kubenswrapper[4752]: I0929 11:38:22.284166 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-brllf_b31d27ce-a227-4b44-ac25-10e7c9133fb8/extract-content/0.log" Sep 29 11:38:22 crc kubenswrapper[4752]: I0929 
11:38:22.290213 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-brllf_b31d27ce-a227-4b44-ac25-10e7c9133fb8/extract-utilities/0.log" Sep 29 11:38:22 crc kubenswrapper[4752]: I0929 11:38:22.301385 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-brllf_b31d27ce-a227-4b44-ac25-10e7c9133fb8/extract-content/0.log" Sep 29 11:38:22 crc kubenswrapper[4752]: I0929 11:38:22.457524 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-brllf_b31d27ce-a227-4b44-ac25-10e7c9133fb8/extract-content/0.log" Sep 29 11:38:22 crc kubenswrapper[4752]: I0929 11:38:22.474612 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-brllf_b31d27ce-a227-4b44-ac25-10e7c9133fb8/extract-utilities/0.log" Sep 29 11:38:22 crc kubenswrapper[4752]: I0929 11:38:22.752997 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bpmrk_201657e8-ebe9-4415-acd9-9971ede44bd2/extract-utilities/0.log" Sep 29 11:38:23 crc kubenswrapper[4752]: I0929 11:38:23.054553 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-brllf_b31d27ce-a227-4b44-ac25-10e7c9133fb8/registry-server/0.log" Sep 29 11:38:23 crc kubenswrapper[4752]: I0929 11:38:23.191259 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bpmrk_201657e8-ebe9-4415-acd9-9971ede44bd2/extract-utilities/0.log" Sep 29 11:38:23 crc kubenswrapper[4752]: I0929 11:38:23.225963 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bpmrk_201657e8-ebe9-4415-acd9-9971ede44bd2/extract-content/0.log" Sep 29 11:38:23 crc kubenswrapper[4752]: I0929 11:38:23.227457 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-bpmrk_201657e8-ebe9-4415-acd9-9971ede44bd2/extract-content/0.log" Sep 29 11:38:23 crc kubenswrapper[4752]: I0929 11:38:23.410881 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bpmrk_201657e8-ebe9-4415-acd9-9971ede44bd2/extract-content/0.log" Sep 29 11:38:23 crc kubenswrapper[4752]: I0929 11:38:23.430458 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bpmrk_201657e8-ebe9-4415-acd9-9971ede44bd2/extract-utilities/0.log" Sep 29 11:38:23 crc kubenswrapper[4752]: I0929 11:38:23.627072 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4_71f15cd4-4eb9-4fcb-b2e9-789c06fa9670/util/0.log" Sep 29 11:38:23 crc kubenswrapper[4752]: I0929 11:38:23.934466 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4_71f15cd4-4eb9-4fcb-b2e9-789c06fa9670/pull/0.log" Sep 29 11:38:23 crc kubenswrapper[4752]: I0929 11:38:23.977243 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4_71f15cd4-4eb9-4fcb-b2e9-789c06fa9670/util/0.log" Sep 29 11:38:24 crc kubenswrapper[4752]: I0929 11:38:24.026754 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bpmrk_201657e8-ebe9-4415-acd9-9971ede44bd2/registry-server/0.log" Sep 29 11:38:24 crc kubenswrapper[4752]: I0929 11:38:24.050858 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4_71f15cd4-4eb9-4fcb-b2e9-789c06fa9670/pull/0.log" Sep 29 11:38:24 crc kubenswrapper[4752]: I0929 11:38:24.168865 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4_71f15cd4-4eb9-4fcb-b2e9-789c06fa9670/util/0.log" Sep 29 11:38:24 crc kubenswrapper[4752]: I0929 11:38:24.208134 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4_71f15cd4-4eb9-4fcb-b2e9-789c06fa9670/extract/0.log" Sep 29 11:38:24 crc kubenswrapper[4752]: I0929 11:38:24.219264 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966hkt4_71f15cd4-4eb9-4fcb-b2e9-789c06fa9670/pull/0.log" Sep 29 11:38:24 crc kubenswrapper[4752]: I0929 11:38:24.277336 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-64489_6d06e125-ac1f-4214-8e49-35a46d23413b/marketplace-operator/0.log" Sep 29 11:38:24 crc kubenswrapper[4752]: I0929 11:38:24.444300 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zwcjc_10b46062-2635-4c36-9153-0f5b4ae3c054/extract-utilities/0.log" Sep 29 11:38:24 crc kubenswrapper[4752]: I0929 11:38:24.543321 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zwcjc_10b46062-2635-4c36-9153-0f5b4ae3c054/extract-utilities/0.log" Sep 29 11:38:24 crc kubenswrapper[4752]: I0929 11:38:24.571085 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zwcjc_10b46062-2635-4c36-9153-0f5b4ae3c054/extract-content/0.log" Sep 29 11:38:24 crc kubenswrapper[4752]: I0929 11:38:24.581026 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zwcjc_10b46062-2635-4c36-9153-0f5b4ae3c054/extract-content/0.log" Sep 29 11:38:24 crc kubenswrapper[4752]: I0929 11:38:24.750705 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-zwcjc_10b46062-2635-4c36-9153-0f5b4ae3c054/extract-content/0.log" Sep 29 11:38:24 crc kubenswrapper[4752]: I0929 11:38:24.801557 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zwcjc_10b46062-2635-4c36-9153-0f5b4ae3c054/extract-utilities/0.log" Sep 29 11:38:24 crc kubenswrapper[4752]: I0929 11:38:24.824356 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hqlqv_4a95b36e-5f4a-4681-9dce-b4980f41ec2a/extract-utilities/0.log" Sep 29 11:38:24 crc kubenswrapper[4752]: I0929 11:38:24.881390 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zwcjc_10b46062-2635-4c36-9153-0f5b4ae3c054/registry-server/0.log" Sep 29 11:38:25 crc kubenswrapper[4752]: I0929 11:38:25.029753 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hqlqv_4a95b36e-5f4a-4681-9dce-b4980f41ec2a/extract-utilities/0.log" Sep 29 11:38:25 crc kubenswrapper[4752]: I0929 11:38:25.033321 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hqlqv_4a95b36e-5f4a-4681-9dce-b4980f41ec2a/extract-content/0.log" Sep 29 11:38:25 crc kubenswrapper[4752]: I0929 11:38:25.049826 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hqlqv_4a95b36e-5f4a-4681-9dce-b4980f41ec2a/extract-content/0.log" Sep 29 11:38:25 crc kubenswrapper[4752]: I0929 11:38:25.216737 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hqlqv_4a95b36e-5f4a-4681-9dce-b4980f41ec2a/extract-content/0.log" Sep 29 11:38:25 crc kubenswrapper[4752]: I0929 11:38:25.219056 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hqlqv_4a95b36e-5f4a-4681-9dce-b4980f41ec2a/extract-utilities/0.log" Sep 
29 11:38:25 crc kubenswrapper[4752]: I0929 11:38:25.814777 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hqlqv_4a95b36e-5f4a-4681-9dce-b4980f41ec2a/registry-server/0.log" Sep 29 11:38:37 crc kubenswrapper[4752]: I0929 11:38:37.817830 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-48xsw_6fed8192-c969-4d5e-90f8-8dfcbdb533f6/prometheus-operator/0.log" Sep 29 11:38:37 crc kubenswrapper[4752]: I0929 11:38:37.935759 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-66f87598ff-kwzp7_81a0fa10-1cf0-4d64-8d87-be74cb9f191c/prometheus-operator-admission-webhook/0.log" Sep 29 11:38:37 crc kubenswrapper[4752]: I0929 11:38:37.956394 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-66f87598ff-nzms9_0dcbf226-4af2-490a-8974-2c107af2a51f/prometheus-operator-admission-webhook/0.log" Sep 29 11:38:38 crc kubenswrapper[4752]: I0929 11:38:38.125000 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-xlqjx_48a83e0b-a019-4677-9d2c-4eaafc3a36b9/operator/0.log" Sep 29 11:38:38 crc kubenswrapper[4752]: I0929 11:38:38.177766 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-6584dc9448-w8t78_a408d775-16c4-45e5-9750-e3b3f8141bd6/observability-ui-dashboards/0.log" Sep 29 11:38:38 crc kubenswrapper[4752]: I0929 11:38:38.340467 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-gvb7z_094aaacc-0758-4722-bc88-f4d4fc529d36/perses-operator/0.log" Sep 29 11:39:26 crc kubenswrapper[4752]: I0929 11:39:26.175818 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 11:39:26 crc kubenswrapper[4752]: I0929 11:39:26.176455 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:39:40 crc kubenswrapper[4752]: I0929 11:39:40.129669 4752 generic.go:334] "Generic (PLEG): container finished" podID="95d94158-0ee1-4665-ab07-2656c87c4881" containerID="73479a08349ee47fcb2639ac5576aa1eedca29689e911463f1118379c0be0f22" exitCode=0 Sep 29 11:39:40 crc kubenswrapper[4752]: I0929 11:39:40.129767 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j48h8/must-gather-2f47z" event={"ID":"95d94158-0ee1-4665-ab07-2656c87c4881","Type":"ContainerDied","Data":"73479a08349ee47fcb2639ac5576aa1eedca29689e911463f1118379c0be0f22"} Sep 29 11:39:40 crc kubenswrapper[4752]: I0929 11:39:40.130925 4752 scope.go:117] "RemoveContainer" containerID="73479a08349ee47fcb2639ac5576aa1eedca29689e911463f1118379c0be0f22" Sep 29 11:39:40 crc kubenswrapper[4752]: I0929 11:39:40.411530 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-j48h8_must-gather-2f47z_95d94158-0ee1-4665-ab07-2656c87c4881/gather/0.log" Sep 29 11:39:48 crc kubenswrapper[4752]: I0929 11:39:48.722978 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-j48h8/must-gather-2f47z"] Sep 29 11:39:48 crc kubenswrapper[4752]: I0929 11:39:48.725023 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-j48h8/must-gather-2f47z" podUID="95d94158-0ee1-4665-ab07-2656c87c4881" containerName="copy" 
containerID="cri-o://cd561ceee4ab9eb71a590a7c10aa7091e297e3dd1e33f9f0d1d2986d87b7fee2" gracePeriod=2 Sep 29 11:39:48 crc kubenswrapper[4752]: I0929 11:39:48.729153 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-j48h8/must-gather-2f47z"] Sep 29 11:39:49 crc kubenswrapper[4752]: I0929 11:39:49.203853 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-j48h8_must-gather-2f47z_95d94158-0ee1-4665-ab07-2656c87c4881/copy/0.log" Sep 29 11:39:49 crc kubenswrapper[4752]: I0929 11:39:49.204553 4752 generic.go:334] "Generic (PLEG): container finished" podID="95d94158-0ee1-4665-ab07-2656c87c4881" containerID="cd561ceee4ab9eb71a590a7c10aa7091e297e3dd1e33f9f0d1d2986d87b7fee2" exitCode=143 Sep 29 11:39:49 crc kubenswrapper[4752]: I0929 11:39:49.767948 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-j48h8_must-gather-2f47z_95d94158-0ee1-4665-ab07-2656c87c4881/copy/0.log" Sep 29 11:39:49 crc kubenswrapper[4752]: I0929 11:39:49.768772 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j48h8/must-gather-2f47z" Sep 29 11:39:49 crc kubenswrapper[4752]: I0929 11:39:49.847726 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/95d94158-0ee1-4665-ab07-2656c87c4881-must-gather-output\") pod \"95d94158-0ee1-4665-ab07-2656c87c4881\" (UID: \"95d94158-0ee1-4665-ab07-2656c87c4881\") " Sep 29 11:39:49 crc kubenswrapper[4752]: I0929 11:39:49.848182 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd8k7\" (UniqueName: \"kubernetes.io/projected/95d94158-0ee1-4665-ab07-2656c87c4881-kube-api-access-sd8k7\") pod \"95d94158-0ee1-4665-ab07-2656c87c4881\" (UID: \"95d94158-0ee1-4665-ab07-2656c87c4881\") " Sep 29 11:39:49 crc kubenswrapper[4752]: I0929 11:39:49.857370 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95d94158-0ee1-4665-ab07-2656c87c4881-kube-api-access-sd8k7" (OuterVolumeSpecName: "kube-api-access-sd8k7") pod "95d94158-0ee1-4665-ab07-2656c87c4881" (UID: "95d94158-0ee1-4665-ab07-2656c87c4881"). InnerVolumeSpecName "kube-api-access-sd8k7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 11:39:49 crc kubenswrapper[4752]: I0929 11:39:49.941461 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95d94158-0ee1-4665-ab07-2656c87c4881-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "95d94158-0ee1-4665-ab07-2656c87c4881" (UID: "95d94158-0ee1-4665-ab07-2656c87c4881"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 11:39:49 crc kubenswrapper[4752]: I0929 11:39:49.949849 4752 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/95d94158-0ee1-4665-ab07-2656c87c4881-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 29 11:39:49 crc kubenswrapper[4752]: I0929 11:39:49.949895 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd8k7\" (UniqueName: \"kubernetes.io/projected/95d94158-0ee1-4665-ab07-2656c87c4881-kube-api-access-sd8k7\") on node \"crc\" DevicePath \"\"" Sep 29 11:39:50 crc kubenswrapper[4752]: I0929 11:39:50.043708 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95d94158-0ee1-4665-ab07-2656c87c4881" path="/var/lib/kubelet/pods/95d94158-0ee1-4665-ab07-2656c87c4881/volumes" Sep 29 11:39:50 crc kubenswrapper[4752]: I0929 11:39:50.216256 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-j48h8_must-gather-2f47z_95d94158-0ee1-4665-ab07-2656c87c4881/copy/0.log" Sep 29 11:39:50 crc kubenswrapper[4752]: I0929 11:39:50.216857 4752 scope.go:117] "RemoveContainer" containerID="cd561ceee4ab9eb71a590a7c10aa7091e297e3dd1e33f9f0d1d2986d87b7fee2" Sep 29 11:39:50 crc kubenswrapper[4752]: I0929 11:39:50.216878 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j48h8/must-gather-2f47z" Sep 29 11:39:50 crc kubenswrapper[4752]: I0929 11:39:50.237554 4752 scope.go:117] "RemoveContainer" containerID="73479a08349ee47fcb2639ac5576aa1eedca29689e911463f1118379c0be0f22" Sep 29 11:39:56 crc kubenswrapper[4752]: I0929 11:39:56.176211 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 11:39:56 crc kubenswrapper[4752]: I0929 11:39:56.177292 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:40:18 crc kubenswrapper[4752]: I0929 11:40:18.046411 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-84dqj"] Sep 29 11:40:18 crc kubenswrapper[4752]: I0929 11:40:18.052984 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-84dqj"] Sep 29 11:40:20 crc kubenswrapper[4752]: I0929 11:40:20.045228 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd" path="/var/lib/kubelet/pods/9bc68bdf-98e2-4ebf-be29-30fb59ebf7bd/volumes" Sep 29 11:40:23 crc kubenswrapper[4752]: I0929 11:40:23.208106 4752 scope.go:117] "RemoveContainer" containerID="ec28be1e5f634f90da981f9c5e07d88e4d408e07e2938fcce564217941a2ceee" Sep 29 11:40:26 crc kubenswrapper[4752]: I0929 11:40:26.175518 4752 patch_prober.go:28] interesting pod/machine-config-daemon-mgrvs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 11:40:26 crc kubenswrapper[4752]: I0929 11:40:26.176136 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 11:40:26 crc kubenswrapper[4752]: I0929 11:40:26.176179 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" Sep 29 11:40:26 crc kubenswrapper[4752]: I0929 11:40:26.176888 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4f090cb06b45016b138b19baf0457d61e893d318bd9d300e02183fbc10bfe7df"} pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 11:40:26 crc kubenswrapper[4752]: I0929 11:40:26.176933 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" podUID="5863c243-797d-462a-b11f-71aaf005f8d1" containerName="machine-config-daemon" containerID="cri-o://4f090cb06b45016b138b19baf0457d61e893d318bd9d300e02183fbc10bfe7df" gracePeriod=600 Sep 29 11:40:26 crc kubenswrapper[4752]: I0929 11:40:26.529450 4752 generic.go:334] "Generic (PLEG): container finished" podID="5863c243-797d-462a-b11f-71aaf005f8d1" containerID="4f090cb06b45016b138b19baf0457d61e893d318bd9d300e02183fbc10bfe7df" exitCode=0 Sep 29 11:40:26 crc kubenswrapper[4752]: I0929 11:40:26.529493 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" 
event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerDied","Data":"4f090cb06b45016b138b19baf0457d61e893d318bd9d300e02183fbc10bfe7df"} Sep 29 11:40:26 crc kubenswrapper[4752]: I0929 11:40:26.529529 4752 scope.go:117] "RemoveContainer" containerID="4edd0d5d46b3e4dc4505574db2621f4af50e77316cad134acc80e3c6d10d13c6" Sep 29 11:40:27 crc kubenswrapper[4752]: I0929 11:40:27.546000 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgrvs" event={"ID":"5863c243-797d-462a-b11f-71aaf005f8d1","Type":"ContainerStarted","Data":"0318b63b458721532cf7a5cfd89cc02897c3635b9005d821dff9ae4fe342030e"} Sep 29 11:40:32 crc kubenswrapper[4752]: I0929 11:40:32.052076 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-20c0-account-create-jr4ls"] Sep 29 11:40:32 crc kubenswrapper[4752]: I0929 11:40:32.057934 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-20c0-account-create-jr4ls"] Sep 29 11:40:34 crc kubenswrapper[4752]: I0929 11:40:34.054431 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="307f40b1-f2e8-408e-9d6b-068f0259c850" path="/var/lib/kubelet/pods/307f40b1-f2e8-408e-9d6b-068f0259c850/volumes" Sep 29 11:40:39 crc kubenswrapper[4752]: I0929 11:40:39.046362 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6"] Sep 29 11:40:39 crc kubenswrapper[4752]: I0929 11:40:39.054981 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-fnlm6"] Sep 29 11:40:40 crc kubenswrapper[4752]: I0929 11:40:40.042174 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c788fc02-bde8-4ee8-a380-0c50d4b56c65" path="/var/lib/kubelet/pods/c788fc02-bde8-4ee8-a380-0c50d4b56c65/volumes" Sep 29 11:41:23 crc kubenswrapper[4752]: I0929 11:41:23.280426 4752 scope.go:117] "RemoveContainer" 
containerID="3d7e7f3e450c6ea4f1c1ef55f7b063fcbd91f92640ab62a4ba48a2ea1225a5cb" Sep 29 11:41:23 crc kubenswrapper[4752]: I0929 11:41:23.305983 4752 scope.go:117] "RemoveContainer" containerID="6f2de9bbeb6be56bd04b3f9016ee5f773fc52594a8f8e7e200ac0433ccec7f63"